North Star

Successful companies usually have a secret sauce. It could be an algorithm or an insight. But whatever that secret sauce is, it is used to create or disrupt a market.

Apple created the PC market when Steve and Steve figured out that affordable pre-built personal computers would be really useful for consumers. IBM disrupted the PC market that Apple built with the insight that a standard, expandable PC would be especially valuable to businesses. After a while Microsoft disrupted the disrupter with the key insight that PC resources (CPU speed, RAM size, and disk space) were essentially infinite according to Moore’s Law.

Yet secret sauce alone is not enough to create or disrupt a market for very long. You might have a brilliant algorithm or insight, but if you can’t focus on it and deliver it to your audience then you’ve got nothing.

Secret sauces are common and cheap. The ability to focus and deliver is rare and expensive!

Let’s take the case of Google. Larry and Sergey started Google with the idea of PageRank. They turned that idea into a set of algorithms and code and turned it loose on the web. Suddenly Larry and Sergey had the best search engine on the market.

But PageRank on its own didn’t create Google. This might be hard to believe today, but when Google started it was an underdog. Google was the epitome of a scrappy startup that hardly anyone paid any attention to.

Luckily Larry and Sergey had something else: A north star.

I don’t know if they called it a “north star”. That’s what we call it now. They probably didn’t call it anything. Looking back, I think Larry and Sergey, like Steve and Steve, and all successful market creators/disrupters had an intuitive sense of focus and delivery that was superhuman. They got everyone around them, investors, employees, and partners, to focus on search and to think hard about the best way to deliver search to the consumer. They followed their north star to the detriment of everything else including sleep, civility, and revenue.

Obviously it paid off. Once the nascent search market was disrupted Google attained all the things they had sacrificed. They made money. They decided to be really nice. They got a good night’s sleep.

I see this pattern repeating throughout the boom and bust cycle of business. When a company is following its north star it eventually becomes successful. When a company is distracted or tries to follow too many stars it eventually fails.

When I worked at Apple in the 90s our north star was summed up in the question, “will it sell more Macintoshes?” If you could answer “yes” then you had tacit approval to do it. Don’t ask. Just do it. HyperCard, QuickTime, TrueType, Unicode: these are all examples of technologies that “sold more Macintoshes.”

At the time I was working on ClarisWorks for Kids. It was a bit like Microsoft Office for the K-12 market. Our theory was that productivity software tools for kids would sell more Macintoshes (to parents and schools) and so I was asked to go and do it. I didn’t fill out a product plan or forecast revenue. I just convinced a group of engineers that ClarisWorks for Kids was cool and off we went. I hired as many people as I needed. I figured out features and even helped design the box art. Since I had a north star, I didn’t have to be managed. My boss was more like my personal coach. I went to him for advice and not orders.

Since I had never shipped a product before I made a few mistakes. I didn’t get fired. As long as I was following Apple’s north star everyone had trust and confidence in what I was doing. And I wasn’t special. I was one of hundreds of Apple engineering managers leading projects in partnership with hundreds of engineers all following a single north star.

ClarisWorks for Kids turned out to be a big hit. We won some awards. More importantly we sold a lot of Macintoshes. ClarisWorks for Kids was part of an educational bundle that filled up computer classrooms across the world with PowerPC-based Power Macs.

But then we turned away from our north star.

In the late 1990s Apple’s marketshare continued to slip. In spite of all our focus and smart insights we were not selling enough Macintoshes. RISC chips, CD-ROMs, and built-in digital signal processors were not cutting it with the consumer. Most people bought IBM-compatible PCs that ran Windows.

Instead of doubling down on our north star or discovering a new north star we at Apple decided to pursue many different strategies. Sometimes we would follow multiple strategies at the same time but usually it was a new strategy every month. Some of these new “stars” included “Mac is the best PC” and “Let’s find more ways to make money from our existing users” and “Apple is really a software company!” Ouch. None of these stars became north stars. They were more like fly-by-night comets that burn out by dawn.

Without a strong north star, I could no longer manage myself. I had to be told what to do. One day I was told to “port ClarisWorks for Kids to Windows.” I asked how this project would “sell more Macintoshes?” Apparently Apple wasn’t concerned about that old idea any more, and frankly I had not been asked for an opinion.

So we gritted our teeth and cracked open the Windows 3.1 disks and started porting. It was kind of fun and a huge technical challenge as the Mac programming model was very different from Windows. So we dug into it. As an engineering manager there wasn’t as much for me to do so I got into project plans and status reports. I don’t think anyone read them. At some point we were done. ClarisWorks for Kids could now run under Windows on IBM PCs.

This is the point where we were all laid off. Nothing personal. Business was bad, new management was in town (Steve was back), and Windows software was not needed. It didn’t “sell more Macintoshes” because it didn’t run on a Macintosh.

After we were gone Apple got back in the business of following its original and true north star. Mac computers became exciting again with bold design and a new UNIX-based operating system. (OK, an old UNIX-based OS, but it brought the goodness of UNIX to a mass market.)

ClarisWorks and ClarisWorks for Kids were gone but Apple replaced them with a suite of productivity tools. Pages, Keynote, and Numbers are the great-grandchildren of ClarisWorks. I don’t know if they “sell more Macintoshes” but they have some cool features. Besides, Apple’s north star now is probably “Does it sell more iPhones?” or something like that.

These days I work really hard to provide a north star to my teams and to advocate for a north star in my organization. A good north star is easy to understand and easy to remember. A great north star enables employees to manage themselves and renders budgets and project plans obsolete. An awesome north star fuels growth and turns businesses around.

 

Trolls Are USA

It’s clear that Americans are more divided than ever. Our self-segregating tendencies have been reinforced by the adoption of Internet technologies and algorithms that personalize our newsfeeds to the point that we walk side-by-side down the same streets in different mental worlds.

Before the web, before iPhone, Netflix, and Facebook, the physical limits of radio, television, and print technology meant that we had to share. We had to share the airwaves and primetime and the headlines because they were limited resources.

In the pre-Internet world print was the cheapest communication to scale and thus the most variable. Anyone with a few hundred bucks could print a newsletter but these self-published efforts were clearly inferior to the major newspapers. You could tell yellow journalism from Pulitzer winners just by the look of the typography and feel of the paper in your hands. This was true with books and magazines as well. Quality of information was for the most part synonymous with quality of production.

To put on a radio or TV show you had to be licensed and you needed equipment and technical skills from unionized labor. Broadcast was more resource intensive and thus more controlled than print, and thus more trusted. In 1938 The War of the Worlds radio drama fooled otherwise skeptical Americans into believing they were under attack by Martian invaders. The audience was fooled because the show was presented not as a radio play but as a series of news bulletins breaking into otherwise regularly scheduled programming.

The broadcast technologies of the pre-social media world coerced us into consensus. We had to share them because they were mass media: one-to-many communications where the line between audience and broadcaster was clear and seldom crossed.

Then came the public Internet and the World Wide Web of decentralized distribution. Then came supercomputers in our pockets with fully equipped media studios in our hands. Then came user-generated content, blogging, and tweeting, such that there were as many authors as there were audience members. Here the troll was born.

Before the Internet the closest we got to trolling was the prank phone call. I used to get so many prank phone calls as a high schooler in the 1970s that I simply answered the phone with a prank: “FBI HQ, Agent Smith speaking, how may I direct your call?” Makes me crack up to this day!

If you want to blame some modern phenomenon for the results of the 2016 presidential election, and not the people who didn’t vote, or the flawed candidates, or the FBI shenanigans, then blame the trolls. You might think of the typical troll as a pimply-faced kid in his bedroom with the door locked and the window shades taped shut but those guys are angels compared to the real trolls: the general public. You and me.

Every time you share a link to a news article you didn’t read (which is something like 75% of the time), every time you like a post without critically thinking about it (which is almost always), and every time you rant in anger or in anxiety in your social media of choice you are the troll.

I can see that a few of my favorite journalists and Facebook friends want to blame our divided culture, the spread of misinformation, and the outcome of the election on Facebook. But that’s like blaming the laws of thermodynamics for a flood or the laws of motion for a car crash. Facebook, and social media in general, was the avenue of communication, not the cause. In technology terms, human society is a network of nodes (people), and Facebook, Google, and Twitter are applications that provide easy distribution of information from node to node. The agents that cause info to flow between the social network nodes are human beings, not algorithms.
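
To put the analogy in code, here’s a toy sketch of that model (all the names and the link are hypothetical): the platform only stores the connections; nothing flows until a person decides to share.

    friends = {"ana": ["ben", "caz"], "ben": ["caz"], "caz": []}
    feeds = {name: [] for name in friends}

    def share(person, link):
        # The human, not the algorithm, is the agent that moves the info.
        for follower in friends[person]:
            feeds[follower].append(link)

    share("ana", "hot-take.html")
    print(feeds)  # ben and caz now have the link, whether they read it or not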

It’s hard not to be an inadvertent troll. I don’t have the time to read and research every article that a friend has shared with me. I don’t have the expertise to fact-check and debunk claims outside of my area of expertise. Even when I do share an article about a topic I deeply understand, it’s usually to get a second opinion.

From a tech perspective, there are a few things Facebook, Google, and Twitter can do to keep us from trolling each other. Actually, Google is already doing most of these things with its PageRank algorithm and quality scores for search results. Google even hires human beings to test and verify the quality of its search results. Thus, it’s really hard for us to troll each other with phony web pages claiming to be about cats when dogs are the topic. Kudos to Google!
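
For the curious, the core idea of PageRank fits in a few lines. A minimal sketch (a toy three-page web with the standard 0.85 damping factor; Google’s real system is vastly more elaborate): a page’s score flows in from the pages that link to it, which is why a phony page with no reputable inbound links struggles to rank.

    links = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}  # who links to whom
    rank = {page: 1 / len(links) for page in links}
    for _ in range(50):  # power iteration until the scores settle
        new = {page: 0.15 / len(links) for page in links}
        for page, outs in links.items():
            for target in outs:
                new[target] += 0.85 * rank[page] / len(outs)
        rank = new
    print(rank)  # pages with more (and better) inbound links rank higher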

The following advice is for Facebook and Twitter from an admiring fan…

First, hire human editors. You’re a private company, not a public utility. You can’t be neutral, you are not neutral, so stop pretending to be neutral. I don’t care which side you pick, just pick a side, hire some college-educated, highly opinionated journalists, and edit our news feeds.

Second, give us a “dislike” button and along with it “true” and “false” buttons. “Like” or “retweet” are not the only legitimate responses that human beings have to news. I like the angry face and the wow face but those actions are feelings and thus difficult to interpret clearly in argumentation and discourse. Dislike, true, and false would create strong signals that could help drive me and my friends to true consensus through real conversations.

Third, give us a mix of news that you predict we would like and not like. Give us both sides or all sides. And use forensic algorithms to weed out obvious trash like fake news sites, hate groups with nice names, and teenagers pretending to be celebrities.

A/B test these three ideas, and better ones, and see what happens. My bet is social media will be a healthier place, but a smaller one, with less of the traffic driven by the need to abuse each other.

We’ll still try to troll the hell out of each other but it will be more time consuming. Trolling is part of human nature and so is being lazy. So just make it a little harder to troll.

Before social media our personal trolling was limited to the dinner table or the locker room. Now our trolling knows no bounds because physical limits don’t apply on the Internet. We need limits, like spending limits on credit cards, before we troll ourselves to death.

Faceless Phone

About twelve years ago I attended a management leadership training offsite and received a heavy glass souvenir. When I got home after the event I put that thingamabob, which officially is called a “tombstone”, up on a shelf above my desk. Little did I know that after more than a decade of inert inactivity that souvenir would launch me into the far future of the Internet of Things with an unexpected thud.

Last night before bed I set my iPhone 6 Plus down on my desk and plugged it in for charging. Then I reached up to the shelf above to get something for my son and BANG! The tombstone leapt off the shelf and landed on my desk. It promptly broke in half and smashed the screen of my iPhone. In retrospect I see now that storing heavy objects above one’s desk is baiting fate, and every so often fate takes the bait.

I’ve seen many people running around the streets of Manhattan with cracked screens. My screen was not just cracked. It was, as the kids say, a crime scene. I knew that procrastination was not an option. This phone’s face was in ruins and I needed to get it fixed immediately.

No problem! There are several wonderful Apple Stores near me and I might even have the phone covered under Apple Care. Wait! There was a problem! I had several appointments in the morning and I wasn’t getting to any Apple Stores until late afternoon.

Why was this a big deal? Have you tried to navigate the modern world without your smart phone lately? No music, no maps, no text messages! Off the grid doesn’t begin to cover it! My faceless phone was about to subject me to hours of isolation, boredom, and disorientation!

Yes, I know, a definitive first world problem. Heck! I lived a good 20 years before smart phones became a thing. I could handle a few hours without podcasts, Facebook posts, and Pokemon Go.

In the morning I girded my loins, which is what one does when one’s iPhone is smashed. I strapped on my Apple Watch and sat down at my desk for a few hours of work-related phone calls, emails, and chat messages.

Much to my surprise, even though I could not directly access my phone, almost all of its features and services were available. While the phone sat on my desk with a busted screen its inner workings were working just fine. I could make calls and send text messages with my watch, with my iMac, and with voice commands. I didn’t have to touch my phone to use it! I could even play music via the watch and listen via Bluetooth headphones. I was not cut off from the world!

(Why do these smart phones have screens anyway?)

Around lunch time I had to drive to an appointment and I took the faceless phone with me. I don’t have Apple CarPlay but my iPhone synched up fine with my Toyota’s entertainment system. Since I don’t look at my phone while driving, the cracked screen was not an issue. It just never dawned on me before today that I don’t have to touch the phone to use it.

I imagine that our next paradigm shift will be like faceless phones embedded everywhere. You’ll have CPUs and cloud access in your wrist watch, easy chair, eye glasses, and shoes. You’ll have CPUs and cloud access in your home, car, office, diner, and shopping mall. You’ll get text messages, snap pictures, reserve dinner tables, and check your calendar without looking at a screen.

Now, we’re not quite there yet. I couldn’t use all the apps on my phone without touching them. In fact I could only use a limited set of the built-in apps and operating system features that Apple provides. I had to do without listening to my audiobook on Audible and I couldn’t catch any Pokemon. Siri and Apple Watch can’t handle those third-party app tasks yet.

But we’re close. This means the recent slowdown in smart phone sales isn’t the herald of hard tech times. It’s just the calm before the gathering storm of the next computer revolution. This time the computer in your pocket will move to the clouds. Apple will be a services company! (Google, Facebook, and Amazon too!) Tech giants will become jewelry, clothing, automobile, and housing companies.

Why will companies like Apple have to stop making phones and start making mundane consumer goods like cufflinks and television sets to shift us into the Internet of Things?

Because smooth, flawless integration will be the new UX. Today user experience is all about a well designed screen. In the IoT world, which I briefly and unexpectedly visited today, there won’t be any user interface to see. Instead the UX will be embedded in the objects we touch, use, and walk through.

There will still be some screens. Just as today we still have desktop computers for those jobs that voice control, eye rotations, and gestures can’t easily do. But the majority of consumers will use apps without icons, listen to playlists without apps, and watch videos without websites.

In the end I did get my iPhone fixed. But I’m going to keep visiting the IoT future now that I know how to find it.

First Day of the Year

Welcome to 2016, day one. Imagine if today we could accurately predict what will happen in 2016. We could write a blog post with predictions and then gloat when they all come true!

Here are some of the outcomes I would like to be able to predict:

  • Which movie will win best picture?
  • Which candidates will win the Democratic and Republican nominations and from there win the White House?
  • Which football team will win the Super Bowl and which baseball team will win the World Series?
  • Which stocks should be bought and which should be sold?

But it’s hard to make predictions like these for several reasons. We don’t have all the facts and we don’t know how to rank the facts we do have. The facts can and most likely will change. And even if we have everything we need to make an accurate prediction, it would still only be a probability; even if an outcome is 99.999% likely to happen there is still a slim chance, 0.001%, that it won’t happen.
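
A quick simulation makes the point (a toy sketch; the 99.999% figure is just the one from the paragraph above): run a near-certain event enough times and the unlikely outcome still shows up.

    import random

    p = 0.99999           # the "sure thing" happens with 99.999% probability
    trials = 1_000_000
    misses = sum(random.random() > p for _ in range(trials))
    print(f"{misses} misses in {trials:,} trials")  # expect about 10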

One approach to predictions is to use the wisdom of the crowd and just ask people what they think. This is how opinion polls work. The problem here is that much of the time people don’t know their own opinions, and how a question is asked creates bias towards an answer. Not to mention that people just change their minds over time, which makes for stale predictions.

Another approach is to use the wisdom of the market and create a marketplace where people can bet on outcomes. This is really what the stock market is. The prices of Apple, Microsoft, or Alphabet shares aren’t a valuation of what those companies are worth today but what they will be worth at some point in the future. Sadly, the stock market has a spotty record at predicting the future health and success of a company.

And you can always ask an expert what she thinks is going to happen, though that’s usually the least accurate way to make a prediction. There are enough experts out there that one or two, out of hundreds or thousands, end up getting lucky and predicting accurate outcomes. There’s a movie out now about how 3-4 people predicted the mortgage crisis of 2008. Sometimes even if you know the future other people are not going to listen. They can’t! They are too invested in the present to make the changes needed to avoid catastrophe. And a thousand years from now we might learn that the financial meltdown in 2008 prevented a worse outcome!

Alan Kay and Abraham Lincoln both said “The best way to predict the future is to create/invent it.”

Given the difficulties involved in making accurate and reliable predictions, and the nature of probability, it’s best not to focus on guessing the future. It is more productive to help bring about the future that you want to happen. Both Kay and Lincoln were pretty smart guys!

So here are some of the things I’d like to make happen in 2016…

  • I’d like the Internet to go faster so I’m going to do my best to speed up the performance of websites, mobile apps, and services I’m responsible for. Waiting for resources to load is killing all of us. We don’t need new tools and frameworks to speed up the Internet. We just need to do our jobs better!
  • I’d like there to be less misinformation and more accurate information available on the Internet so I’m going to encourage thoughtful, civil, responsible people to blog and post more. Maybe that will crowd out some of the noise.
  • I’d like more people to enjoy Math and Science and coding so I’m going to be more of an advocate for learning Calculus in middle age, keeping up with Science at any age, and learning to code from non-Computer Science backgrounds. (I love music, novels, and movies but Geometry and Algorithms deserve appreciation too!)

I predict these tasks will be tough but I’ll make some progress—especially since a whole lot of other people are working toward the same goals.

 

Quick Thoughts: Apple Watch Sport, AppleTV, Magic Trackpad 2, iPad Pro

This year I had a lot of Apple products to buy. Other than buying a new iPhone every couple of years the rest of my Apple gear didn’t need updating. iMac, MacBooks, and iPads got a little faster, a little thinner, and a little more expensive, but not so much that I really needed to break down and acquire new ones. Being an Apple fan is an expensive hobby so it was kind of nice to have nothing new to buy. But then came 2015 and all these new toys!

Apple Watch Sport

Positives: I wear it and use it every day! I like the calendar, messaging, and fitness notifications. The iPhone and Apple Watch are very well integrated. It’s great to respond to messages and phone calls without taking out my phone. (I feel a little silly talking like Dick Tracy to my wrist.) I did like the game LifeLine (which is well integrated as a text adventure game) for a little while. I have a nice collection of wrist bands.

Negatives: I’ve turned off 90% of app notifications. None of the 3rd party apps, except HipChat, have been useful. There is a lag when accessing some apps that makes me impatient. Charging the watch with the disc is a little weird. The sport wristbands bothered my skin so I’ve switched to an inexpensive leather band. I’d like to see more games like LifeLine.

AppleTV

Positives: The whole family loves it. Crossy Road was a big hit and the first time we’ve gathered in front of the TV to play a game since before the kids graduated from high school. The user experience is excellent. Apple Music and Photos on the big screen are awesome. AppleTV is our go-to Netflix, Hulu, and HBO Go tool. I want to write a game for it!

Negatives: I don’t have a 4K TV but I’m worried that AppleTV doesn’t support 4K. (Is that irrational FOMO?) Most of the AppleTV apps don’t excite us. The remote is hard to deal with except when playing a game.

Magic Trackpad 2

Positives: No more environment-killing batteries required. The force touch feature is cool for previewing web pages from links. It’s bigger and more comfortable for gestures.

Negatives: I keep forgetting to use force touch.

iPad Pro

Positives: For me the iPad Pro is the breakout hit of Apple’s current product line. The Smart Keyboard is not terrible and the Apple Pencil is amazing. I like to draw and it’s the best drawing experience I’ve had (and I have tried just about every tablet and stylus, including the Cintiq). For work the iPad Pro is 75% of a laptop replacement. It turns out that for email, word processing, presentations, spreadsheets, messaging, and web browsing, I don’t need a complete desktop operating system in my lap–Split View is enough. The screen is as big as my MacBook Air’s with higher resolution. Reading ebooks and PDFs is a pleasure. And watching movies and TV is like having a personal cinematic experience with surround sound at hand. It’s simpler and feels faster than the Surface Pro or Chromebook. The Apple and Microsoft apps for the iPad Pro work well. Byword, Coda, Procreate, Graphic, and Assembly are creative iPad Pro apps I recommend. I’ve never wanted to develop an iPad app before now (iPhone was all that mattered to me as a dev).

Negatives: It’s big (but not heavy). I wish I could fold it in half. I want the keyboard to light up. I want a place to put the pen when I’m not using it. Old iPad apps look ridiculous on the iPad Pro. Not all apps support the split view feature. The Smart Keyboard doesn’t work well with developer websites like Cloud9 and CodePen. The Facebook iPad app is stale. I’m afraid that charging the pencil in the iPad Pro’s power port will break its Lightning connector off!

It’s a great time to be an Apple fan and an Apple developer. There are still plenty of problems with the Apple ecosystem. Apple News is slow and poorly designed. The App Store has discovery, spam, and monetization problems. The Safari browser needs to catch up to Chrome. But Swift is the best programming language since Smalltalk and is now open source. So there’s that. It all evens out.

The Desktop Strikes Back

Darth vs Obi Wan

I was surprised and delighted by Microsoft’s introduction of the Surface Pro 4 and Surface Book. I have a feeling that Microsoft is doing something really interesting: bringing back the general purpose personal computer. Wait, wait, I know what you are thinking! It’s all about the phones and pads and the Internet of Things! I get it! I’m not some old guy pining for the days when PCs were king and 640K of RAM was a luxury. Well, actually, I am that old guy. But I have not personally coded a desktop app, native or web, since 2010. Everything I do for work or play is meant for mobile devices. I’m usually the guy in the conference room saying “We need to focus on mobile!” and “Kids today don’t even know what a desktop is.”

But Microsoft and some of the recent changes to Mac OS X in El Capitan are making me think there is some life yet left in the PC.

While Apple is targeting coffee-shop consumers by making MacBooks lighter but less powerful, or targeting highly specialized markets with high-resolution workstations, Microsoft has reminded me that there is a vast middle in this market. And that middle is still mostly using desktops that run Windows. There hasn’t been growth in the middle for a while, but then again there hasn’t been much product to spur growth.

Every year I want to buy a new phone. I swear I have every iPhone model in a drawer starting with number 3. But buying a new computer is something I do only when I absolutely must. There just isn’t any reason to upgrade a contemporary desktop or laptop. And looking at where Apple and Dell and other PC manufacturers were going, it seemed to me that PCs were just getting specialized. The middle ground was a no man’s land of crappy, slow plastic PCs encircled by ultra-lights and gaming rigs.

A while back I bought a Surface Pro 3 with its pen, keyboard cover, and Windows OS. I found it… interesting. I had to pair it with an Apple Wireless Bluetooth Keyboard to get a decent typing experience. And Windows 10 is still a little rough. Ok, Windows 10 is a lot rough. And confusing. But it’s getting better.

I feel a great nostalgia for all things from the original Bill Gates/Steve Jobs era. I will probably end up acquiring a Surface Pro 4 or a Surface Book. I’m pretty sure neither of those products will displace my iMac 5K as my go-to general purpose computer for coding, blogging, podcast editing, and cartooning. (Everything else I do, I do on my iPhone.)

But heck, I want Microsoft to win here and bring the PC back to the forefront of the consumer electronics revolution. So here are five suggestions or tips for MS that would have me running to the Microsoft Store as if they were selling Tesla Model Xs at a deep discount!

Tip 1: Really rethink Windows and the UX of a desktop operating system.

I know MS got in trouble for removing the Start Menu. But seriously: There is no Start Menu in Mac OS X or iOS because for the most part the whole operating system is the Start Menu. Go back and look at the Xerox Star if you have to. Don’t try to mask complexity with a handful of easy-to-use screens hiding the real OS. When I worked at Apple we had a saying: “Every pixel counts.” It’s clear to me that on Windows some pixels count more than others.

Tip 2: Bring back desk accessories

I know that both Apple and Microsoft have failed at providing consumers with a library of little single-purpose applets that share the desktop with the bigger multipurpose applications. But, as a guy who once wrote a mildly popular Yahoo Widget, I know there is real consumer value in DAs. I think the original Mac OS and PC DOS got it right: Apple’s Desk Accessories and Borland’s Sidekick provided little utility functions that were easy to access, simple to use, and fast to summon and hide. By contrast Apple’s Dashboard Widgets and Microsoft’s Desktop Gadgets were slow and clunky. These descendants of the desk accessory were too ambitious and missed the whole point. I want “info at my fingertips.”

Tip 3: Fix the menu bar or retire it

I was so excited when Mac OS X El Capitan enabled me to hide not only the taskbar but the menu bar as well. I hate the menu bar! It’s usually a dumping ground for every feature of an app, randomly arranged. Long ago the menu bar had a formal structure. It was drilled into my head as a young software developer that menu titles were nouns and menu items were verbs. If I had a Document menu then all the menu items were the operations that could be performed on documents. But right from the get-go both Apple and Microsoft ignored that simple and powerful idea. Almost all Windows and Mac apps have separate “File” and “Document” menus. I know that files are those objects that computer applications store data into, but we tell consumers to call those things documents. Everyone is confused. And then there is the universal “Edit” menu, which should be called the “Selection” menu. This might seem like small potatoes but I’ve learned that trivial details are the stumbling blocks that kill product adoption.

Tip 4: Make the desktop a first class entity

Most flavors of Unix are doing the desktop right, and Apple and Microsoft are starting to get clued in. It should be very easy to set up and arrange windows on a desktop and have them stay that way for eternity. Like really forever, and definitely between restarts and system updates. Adobe understands this and gives each of its apps a layout manager that allows artists to personalize and save their workspace. Context is everything. Humans are dumber in unfamiliar contexts and smarter in well-known contexts. A desktop is really just a context of virtual objects. I think phones are easier to use, not because they are better designed than PCs, but because they naturally have just one context, one screen, at a time.

Tip 5: A list of five more tips

Bonus round!

  1. Don’t go too far trying to make the desktop UX the same as the mobile UX. They are two different use cases. Shortcut keys, context menus, and overlapping windows are great features and can’t really be replaced by gestures, hard presses, and split screens.
  2. Bring back BASIC or HyperCard or some kind of programming environment that intelligent non-computer scientists can use to create real apps on their own. It’s not about workflow automation. Do not copy Apple’s lame Automator or evil AppleScript.
  3. Clean up your Windows Store. Be even more picky than Apple. Keep out the spam, copycats, and useless garbage. But make sure users can continue to download and install non-certified apps. I know it’s risky but it’s also capitalism.
  4. Reactivate Windows’ third-party developer base, not by enabling quick and dirty ports of websites into Windows apps but by continuing to empower, simplify, and open up Visual Studio. I went to one of the very first Windows developer events in Redmond in the early 90s. I got to shake Bill’s hand. I’m sure he doesn’t remember me but I really wanted to write Windows apps after that.
  5. Continue to revive and refine the general purpose personal computer that is great for everything and works for everybody. I don’t want or need a workstation. I do want to get a lot of work done. Instead of thinking like Apple, think like the Microsoft that re-packaged the hoity-toity graphical user interface and made it affordable in an open system for schools, small businesses, and nerdy kids.

Even if Microsoft succeeds with the Surface Pro 4 and Surface Book, the PC market will most likely continue to watch Cupertino and Redmond steal marketshare from each other. But unlike smart phones, pads, and household items with embedded microchips, PCs are programmable–by users. And that is something worthy of a battle with the Empire.

When Dogfooding Fails

For over 20 years we’ve been eating our own dog food in the software industry and it’s not working. For the uninitiated, dogfooding means actually using the product you’re developing. It started out as a radical idea at Microsoft and spread as a way to get developers to experience their customers’ pain. On the surface it was a very good idea–especially for an aging corporate culture divorced from its users. When I interviewed with Microsoft in 1995 I was told that all their engineers were given low-end 386 PCs. These PCs ran Windows 95 so slowly that the MS developers were incentivized to improve Windows’ performance to ease their own suffering. I don’t know about you, but I find that Windows is still pretty slow even in 2011 running on a really fast multicore PC. Clearly all this dogfooding is not helping.

So I’d like to frame an argument against dogfooding and in favor of something else: Plagiarism.

My argument goes like this:

  1. Dogfooding doesn’t work, or at least it’s not sufficient, because it’s not a good predictor of software success. Some software that is dogfooded is very successful. Most software that is dogfooded fails. (Most software fails and most software is dogfooded therefore dogfooding fails.)
  2. Dogfooding is really bad because it gives you a false sense of doing something to improve your product: “It’s OK, I know our software is terrible but we’re forcing our employees to dogfood it and out of sheer frustration they will make things better! Everyone go back to sleep…”
  3. Dogfooding reinforces bad product design. Human beings are highly adaptable (and last time I looked software devs are still considered human). We get used to things, especially in a culture where company pride and team spirit are valued (e.g. groupthink). Over time poor performance becomes typical performance. It starts to feel natural. Thus slow-loading Windows operating systems become the gold standard for thousands of loyal Microsoft employees and customers. Instead of fixing the software we are fixed by it.

I believe that the urge to dogfood is an emergent strategy of mature tech companies that want to rejuvenate their software development process. Management starts talking about dogfooding when they realize the spark of creativity has gone out and they want to reignite it.

One of the reasons dogfooding fails is that you never eat your own dog food in the beginning: the dog food doesn’t exist yet. You have to get your inspiration from outside the company. Microsoft Windows was not the first OS with a graphical mouse-driven shell. At some point the Windows devs must have looked at the Apple Lisa and Macintosh computers for inspiration. And the Apple devs looked at the Xerox Star. And the Xerox devs drew their inspiration from the physical world: the first GUI desktop was modeled on an actual physical desktop. No dog food there.

So rather than dogfooding we should be talking about plagiarism. If you want to make a great software product, eat another great software product and make it taste better–don’t eat yucky dog food.

Microsoft should force their devs to use the fastest computers running the best operating systems with really cool applications. I think they must have bought some MacBook Airs and installed Ubuntu and Spotify, because Windows 8 looks pretty awesome 🙂

I Bought A New MacBook Pro and Didn’t Pay an Arm and a Leg!

Apple had a sale over the Thanksgiving weekend. The savings weren’t exactly in Crazy Eddie territory but $101 off a new MacBook Pro just about covers the tax (in NJ). My last MBP had been sitting in pieces on the bookshelf behind my desk at home. I bought it in 2008 and two years of daily commuting between NJ and NYC literally shook it apart. I used Apple’s sale as the thin, poorly veiled excuse to buy a new MBP. The truth is I’m just addicted to shiny new computers and I had to feed the monster.

When it comes to buying a computer I have three criteria:

  1. Don’t buy something that will become obsolete in a quarter.
  2. Don’t buy less or more power than I need.
  3. Pay as little as possible while still buying something that won’t embarrass me in front of the cool kids.

When I met my wife she explained to me that you can tell a lot about a person by their shoes. A cool hip guy might walk around in an outfit from Target but the brand of his shoes will tell you if he is being ironic or a showoff or a cheapskate. In the 21st century you can apply the same criteria to laptops. Some guys (or gals) buy the most expensive luxury desktop replacement money can buy as if to say: “I’m bad!” Other guys buy the cheapest underpowered plastic toy “puter” that buy.com has on sale as if to say: “I make Scrooge McDuck look like Bill Gates!” (The current Bill Gates, not the earlier one who acted a lot like Scrooge McDuck before he got married.) Then there are understated nerds like me who try to say something nuanced with their laptops: “Yes, it’s not the fastest, but we know that RAM and HD speed are more important than raw CPU speed for real world applications.”

After much research and discussion with my hardware otakus this is what I bought and why:

I bought a 15″ MacBook Pro with a 2.4 GHz Intel Core i5 CPU, a 320 GB hard disk, and 4 GB of RAM. This is the least expensive 15″ model Apple sells at $1799. I asked Apple for one extra: a higher resolution LCD display (1680 x 1050 instead of 1440 x 900) at only $100 more. With the Apple sale I got the hi-res screen for free, but at only $100 for roughly 36% more pixels it’s a bargain–one of the few true steals to be found in the Apple Store.

The display resolution is why I bought the 15″ and not the 13″. More pixels means less scrolling and more productivity. I could have bought the 17″ MBP with a whopping 1920 x 1200 screen resolution, but I’ve used the 17″ model before and it’s not really portable. As a hard-core Northeast Corridor commuter I need something that fits into a standard backpack, weighs less than a 3 kg medicine ball, and actually fits on my lap in the crowded train car.
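
Here’s the pixel math behind those choices (the 13″ resolution is from memory, so treat it as approximate):

    base = 1440 * 900  # the standard 15-inch panel
    screens = {"13-inch": (1280, 800), "15-inch hi-res": (1680, 1050), "17-inch": (1920, 1200)}
    for name, (w, h) in screens.items():
        print(f"{name}: {w * h:,} pixels, {w * h / base:.0%} of the standard 15-inch")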

Apple has options for more powerful (Core i7) and faster (2.8 GHz) CPUs. But while benchmark software will show you a 25% to 30% performance boost between the 2.4 GHz i5 and the 2.8 GHz i7, pure CPU speed isn’t the problem unless you’ve unclogged all the other performance bottlenecks in your laptop.

The real roadblocks to laptop snappiness are memory and storage speed and size. Modern operating systems accommodate today’s bloated software applications by organizing memory usage into “pages” and swapping these pages in and out of disk as needed. Adobe Photoshop is the exemplar: it can’t let you edit that 21.1 megapixel image without shuffling pages of memory around. Some operations, like filters, are CPU intensive, but most operations (reading, writing, zooming, scrolling, copying, pasting, …) are memory bound.
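
You can feel the difference yourself with a crude experiment (a sketch; it assumes you have a spare gigabyte of free RAM, and exact timings vary wildly by machine): copying a big buffer is bound by memory bandwidth and paging, while a tight arithmetic loop with a tiny working set is bound by the CPU.

    import time

    big = bytearray(512 * 1024 * 1024)  # a 512 MB buffer, big enough to stress RAM

    start = time.perf_counter()
    clone = bytes(big)                  # memory-bound: touches every page once
    print("copy:", time.perf_counter() - start)

    start = time.perf_counter()
    total = sum(range(10_000_000))      # CPU-bound: tiny working set, lots of math
    print("math:", time.perf_counter() - start)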

To ease the memory bottleneck I ordered a 4 GB RAM stick and a 7200 RPM 500 GB hard disk from a third party: not Apple! Apple charges extraordinarily high prices for RAM and hard disk upgrades. To buff up my MBP through Apple would have cost an additional $550. The third-party RAM and HD only cost me $154.31 and a 1/2 hour to unscrew the back of the MBP and install everything. In the end I had a sweet new MBP with 6 GB of RAM and 1/2 a terabyte of storage. Photoshop is happy.

There is a risk that by upgrading your Mac you’ll ruin it and void the warranty to boot. I always get help from my hardware friends who show me how. There are also some good videos from MacSales that were really helpful. The voiding of the warranty went from a definite yes to a maybe in the last few years. Apple reserves the right to blame your MacBook problems on you if you don’t use an authorized service provider.

For me, it was worth the Geek Cred to personally upgrade my MBP so I could have a great ice breaker at Starbucks:

“Oh, is that the new MBP you got there?”

“Yes, but I saved $400 bucks by upgrading it myself and I got the hi-res screen for free on Black Friday.”

“OMG! 2G2BT! CSA!”

Android SDK Compatibility with Eclipse and JDK

I recently switched my development workstation from a MacBook Pro to a Windows desktop PC. Yeah, I know, I’m going against the trends but it’s a sweet machine I assembled myself based on recommendations from Ash.

Immediately I ran into compatibility problems with Google’s Android SDK and the current versions of Eclipse (Helios) and the Java Development Kit (JDK Version 6). In a nutshell Google’s cool Android dev tools don’t work with Helios–you need to install Eclipse 3.5 (Galileo). Galileo requires JDK Version 5. All this info is prominently featured on the Android system reqs page–but who reads anymore?

Digging up old versions of Eclipse is easy. You can find Galileo here.

Digging up old versions of the JDK is a bureaucratic nightmare. You can find JDK Version 5 here but to install it you have to fill out a form, give away PII, and then wait for an email.

One way around Sun/Oracle’s walled garden is to install OpenOffice 3.2.1, which installs Java 1.6 (JDK Version 6) in such a way that everything compiles.
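
Whichever way a JDK lands on the box, it’s worth confirming what your tools actually resolve to before blaming Eclipse. A quick sketch (it assumes java and javac are on your PATH; note the JDK prints its version banner to stderr):

    import subprocess

    for tool in ("java", "javac"):
        result = subprocess.run([tool, "-version"], capture_output=True, text=True)
        banner = (result.stderr or result.stdout).strip().splitlines()[0]
        print(tool, "->", banner)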

Now that Google is throwing away all their Windows PCs I’m sure this compatibility nonsense will get even worse. Here is a note from Google about enabling debugging on Android phones:

If you develop on Mac OS X or Linux, you do not need a special driver to debug your application on an Android-powered device.

Damn it! I might have to go back to coding on the Mac and only using my PC for trivial tasks like gaming and web browsing. Ironic huh?