Tuesday, December 18, 2012

What's the problem? Is it me?


I'm trying, against considerable odds, to think of words that aren't trite in the face of recent events. I want to address issues more relevant than which phone, TV or tablet to purchase/recommend, even though, in spite of all the tools in my kit, I'm now uncomfortable with every one of them. A nagging conscience suggests that broader issues prevail, that I may be part of the problem and not the solution.

Today's tech is powerful, some of it even useful. But whether because of the unlikely zombie apocalypse or the obvious effects of climate change, choosing a gift might also include the choice of nothing at all. Not an easy sell to kids who might cry at their relative deprivation (all the other kids have one), or to nerds (like me) who would express similar frustration with adult-like rationalization (all the other adults have one).

Every device, from stone wheel to large-screen TV, requires manufacturing, and blind, destructive consumption might portend a dark, dystopian future. I understand, and am also weary of, discussions of environmental diminution, because such talk is truly annoying.

Arguments of conscience might actually cause me to opt out of a new TV for myself, an iPad for a grandparent, or a new vehicle before the tires wear out. At issue is whether making and disposing of today's tech products robs the future to feed the present. While I do believe that we're way past an environmental tipping point, it's also possible that future technology can save us from ourselves. Game on; will an apparent race to the bottom destroy the ability to create cures at a rate that exceeds the consequences?

Like the pain of a broken arm, cognitive dissonance is never pleasant, but hiding behind indifference, or short-term reward, isn't appropriate either. If I can afford an iPad, if I've worked hard (or not) and/or have a credit card, isn't said purchase my constitutional right? Don't we live in a free country? Who are these pesky environmentalists who harp endlessly about coral reefs, tainted water and drought? Why should they have a higher argumentative temperature than my own burning wallet?

Perhaps we live in a time of irreconcilable forces. The world economy, such as it is, is said to depend on growth through consumption. We're told that refrigerators, cars, computers, tablets, all manner of gadgets must be renewed, whether we need them or not, and that failure to participate will cause manufacturing to stop and jobs to disappear. Subsistence is old-fashioned and painful, children can't compete if not connected, and twenty thousand music tracks/uploaded photos and constant, always-on communication are the required minimums, the new social covenants.

Is all this connectivity really necessary? To quote Steve Martin (on Twitter), "This is weird, but I’ve found I can read someone’s mind just by talking to them for only an hour or two."

When I was a child we often visited the Spring Lake cabin of a family friend. He kept a station wagon that was used only for transport from the cabin door to the boat dock, about 200 feet. His was the Greatest Generation, born into the Depression, then conscripted to fight the Second World War. Afterward these brave soldiers were renewed; they lived in the first robust economy of that epoch, a free market that encouraged consumption and indulgence. These Mad Men weren't consciously destructive, they were merely pursuing the American dream, full stop. Even though the station wagon evaporated more gasoline than it burned, and trips to the gas station were infrequent, it was most certainly an unnecessary use of resources. Hidden behind the familiar Ford logo were the oil, steel, blood, electricity and other resources necessary to support their relative luxuries.

Even though I know better, I've followed their example like a little duck. My own closet overflows with gadget excess, so it would seem that I haven't really learned a thing. The long view of it makes me cringe. I try to recycle, but it's well known that these programs have marginal results, that significant, harmful elements of a gadget's construction can't be reclaimed by science or alchemy. Much of the gadget waste we produce will end up in third-world landfills, picked over by scavenging children and/or dismembered by rain and sun into component parts, then sent downstream.

Humans are thought to be a superior, evolved species with unmatched survival tools. Brains allow construction of the world's highest building in ninety days, single submarines that can reduce the entire planet to dust, technology that records everything, from purchasing habits to heartbeats. Brains allowed an otherwise weak and inferior species (that would be us) to dominate bears, mountains, oceans, mosquitoes and apparently inferior peer civilizations.

Hands are the brain's tools, fantastic strength multipliers that fabricated arrows, steam, supercomputers, art, bombs, Bushmaster semiautomatic rifles, yada yada. While pocket-sized gadgets or sense-numbing entertainment centers would seem to be of lesser import than conquering the world or killing children, I wonder: should the dark, destructive results of gadget design be given any less consideration than fossil fuel consumption, over-the-counter assault weapon sales or uncontrollable greed? Do we live in the epoch of a ballyhooed industrial revolution that yields a paradoxical result, wherein the devices that claim extraordinary promise might actually destroy us?

I hope not, I love this stuff, I do the best I can.

Or do I? Please stay tuned...

Wednesday, December 5, 2012

Windows 8? Disruptive Lipstick on a Stable Pig...

I don't want to be a Christmas-stealing Grinch, but because it’s an election year the hype-o-meter began to peg last summer and it’s stressing me out. Silly me, I worry about value; I consider whether products are the best they could be instead of what they are. The average consumer may not notice a quality deficit because of the well-known disease SFF (seasonal feeding frenzy), but I hope expression of my mildly damning opinion might be useful nonetheless.

Before you purchase, please consider the business plans of these major players:

Per the previous blog post (Meh!), Apple is a hardware company, so it must roll out new devices to maintain meteoric capitalization gains. In the past Apple's various laptop and desktop offerings weren't designed as throwaways; in fact there are many older iMacs, towers, even older Mac laptops that are regularly renewed and kept in service. But this isn’t optimal for Apple's business model, so its devices are now sold with the expectation of planned obsolescence.

Google (Android/Chrome) is an advertising company, great at collaboration and data mining, but its UI (User Interface) department was originally staffed by old remote-control designers (getting better but still crude). They throw stuff at walls hoping for adhesion, and some of it sticks (watch the Chromebook; I think it has real potential, and it's very inexpensive). Google's current and future business model is to get people onto the web, and it has already amassed a fortune through search. It dabbles in software and devices, but it’s all in service of advertising click-through.

Microsoft is a software company. To maintain profitability (but not increased capitalization; its stock price hasn't really moved since the discovery of fire) Microsoft must sell software. And now we have the next iteration of its operating system, Windows 8.

Short-term corporate guidance for consumer technology requires that advertising departments concentrate on the next Black Friday, while long-term earnings (aka the long tail) gradually decline. Several years may seem to be quick turnover, but it’s actually epochal. The 'net, and its corresponding uptick in consumer use, spans significantly less than twenty years for most users. Therefore, an OS (operating system) life of 10 years, or even 5, is huge.

Microsoft, unlike our confused and dysfunctional Congress, sort of plays for the short and long term simultaneously. With Windows 8 Microsoft chose not to make a traditional office-user appeal (read: mouse/keyboard). Rather, they've made a major shift, analyzing today's younger users and concluding that they don't give a hoot about anyone's desktop. And why should they? Their lives can be managed, archived (visually, at least), shared and minimally annotated with only a smartphone, so why should/would they bother with a content-creation thingy, collaboration, or a winning corporate strategy? Touch is the new mouse, swipe the new UI (user interface), instant, capricious sharing the new privacy. So this currently immature demographic is rightly projected to shape the long tail of business, even though mobile habits don’t upscale well (yet).

Make no mistake, under the hood Windows 8 has some clear advantages over the current most-used OS, Windows 7. Many of those improvements are devised to mitigate Windows' reputation as a virus-prone, complicated operating system that shouldn't be gifted to a grandmother. It also shifts user focus to the cloud.

Grammy might be a good candidate for Windows if the platform were intuitive, but it's not. Windows 7 has the best Microsoft UI ever, but it has no natural parallel; it's rather an acquired taste. For instance, how might Grammy intuit whether to click once or twice, and on what; when to right-click; or how to navigate drop menus to make complicated command choices (what does "save as" actually mean?)? Credit Steve Jobs: he moved computing into the next big user interface (iPad, iPhone, iPod Touch), wherein much of how people interact with computational devices can be divined by a three-year-old in five minutes.

Moreover, average consumers don't really need Word and Excel; they want to listen to music, watch YouTube, use Facebook, messaging and Skype, read/comment on Twitter and handle old-fashioned email (decreasing). While it was once true that a classic, complicated computer was necessary to accomplish basic tasks, that's no longer the case, so the marketplace is hatching a new species.

Many computers as we've known them are wasting away, some dying in closets, many more retiring to filthy landfills, others renewable upgrade candidates. Most of their owners grew up with mouse moves and keyboard shortcuts, so Microsoft could have continued Windows 7 until Windows 8 was fully baked. After all, Windows 7 is an excellent operating system if properly managed (it often isn’t), but sales projections for an already saturated marketplace won't support outrageous executive salaries and operating/development costs. Notwithstanding those customers looking for an inexpensive computer experience, like cash-strapped parents with kids in school, ‘tis the season to be spending. Pandering to the seasonal frenzy, Microsoft forces (on new computers) an immature release, claiming it to be an improvement over 7, so customers should just get over it. Apple releases anything and blinded fanpeople fall over themselves to get in line, but Windows 8 is not causing much of a stir, except for controversy. Good riddance. Darwinian.

Windows 8 is a mobile-centric UI trying to hop the desktop barrier. If Microsoft had released Windows 8 for mobile only (including phones, tablets, touch-enabled laptops) it might have been a reasonable hit, and if price competitive, even a runaway hit. Instead Microsoft looked five(?) years ahead and decided that today's business model (woo consumers but cater to business) is running on fumes, so any move is better than no move at all (perhaps true).

Now if the aforementioned Grammy (or any user) can suffer the Windows 8 learning curve on a non-touch device, she'll be in less danger than with XP or even 7. Viruses have a harder time getting a grip inside this new OS, applications of questionable lineage are often rejected, and any Metro application (like Weather, Calendar, People, etc.), when not on screen, is actually taken out of active memory, much like Apple's iOS handles multitasking (not really multitasking; it just takes a snapshot of the application's state and freezes it for future reference). This all combines to make a more stable, less power-hungry operating system, but I don’t think mobile 8 will unseat iPad as the runaway winner of the tablet wars this season. Windows 8 laptop/tablet hardware isn’t mature, so the devices on which the MS version of an intuitive(?) user interface must reside, except mobile phones, are not ready.
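
To picture the suspend-and-snapshot idea in miniature, here's a toy sketch in Python. The class, its state fields and the snapshot file are invented for illustration; real Windows 8 and iOS suspension happens at the OS process-lifecycle level, not in app code like this:

```python
import json
import pathlib

class MetroStyleApp:
    """Toy model of a suspended app: off screen, its state is snapshotted
    and the live copy stops consuming memory; on resume the snapshot is
    restored so the user sees the app exactly as they left it.
    (Illustrative only; not how Windows 8 or iOS actually do it.)"""

    def __init__(self, name):
        self.name = name
        self.state = {"scroll_position": 0, "query": ""}  # hypothetical app state

    def suspend(self):
        # Snapshot the state to disk, then drop the in-memory copy,
        # as if the OS had evicted the app from active memory.
        pathlib.Path(self.name + ".snapshot.json").write_text(json.dumps(self.state))
        self.state = None

    def resume(self):
        # Thaw the snapshot: no CPU or battery was spent while suspended.
        self.state = json.loads(pathlib.Path(self.name + ".snapshot.json").read_text())

weather = MetroStyleApp("weather")
weather.state["query"] = "Spring Lake, MI"
weather.suspend()   # user switches away: state frozen on disk
weather.resume()    # user switches back: state restored intact
print(weather.state)
```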

So I conclude that Windows 8, when used on any device without touch, is disruptive lipstick applied to a stable pig (Windows 7). When used on a hybrid tablet (touch, keyboard and mouse enabled) the experience does get better.

The RT version of Windows 8 is a pure mobile play, distributed on devices like the Surface RT. It's important to note that regular Windows applications won't work at all on RT; they must be specifically designed for that platform, and developers are only nibbling at the bait. RT is designed for devices with limited horsepower but doesn't yet satisfy the needs of traditional Windows users, probably because it straddles the fence between desktop and mobile. Barring some surprising uptick in device improvement (battery life, value), I'm going to wait before purchasing a dedicated, touch-screen Windows 8 device. Hopefully Microsoft will learn the art of successful iteration (read: design shift), so stay tuned for Windows Blue.

But for now, if you're purchasing a new Windows 8 computer and it doesn’t have touch, I say fugetaboutit. If you're trying to minimize the financial hit of a new computer and there is a Windows 7 option, you'll find some great bargains out there, particularly in laptops. Plus, 'tis not the season for deferring purchases; the American economy seems to depend on it (pathetic). However, if you’re considering an OS upgrade to old hardware, Windows 8 is the least expensive upgrade yet. I've done it on two conventional computers: one desktop with multiple monitors (an upgrade from Windows 7, painless) and one laptop (my MacBook Air). I’m an experienced Windows user and running Windows 8 is a pain, but more like a nagging toothache than a debilitating sickness; your mileage may vary.

Next up: I don't know. Writing about technology is a dish best served quickly.

Tuesday, November 27, 2012

Meh...

This time of year I'm asked to recommend technology. One might think that a plethora of choice could be construed as a plethora of technical innovation, but I submit that the hype is greater than the reality (shocker!). I'm going to explain my bias in several platform-centric installments, rather than wax adoringly about any of them, because the manufacturers already know how to brag. So (spoiler)... bah, humbug and meh.

Like many mortals, I can't afford to go all in with any manufacturer's version of we're-the-best-for-everyone hype, particularly Apple's (arguably the most expensive). Instead I must pick/choose a few from each type and infer the rest. I own several non-tablet computers (at least 6) running Windows operating systems, but none with touch screens, making Windows 8 a tough sell (more in the next blog post). I also own several Mac devices: the MacBook Air that is my daily toss-about workhorse, an iPod Touch (4G) and a Mac Mini (a media center device); I just gave my iPad (the only one, for now) to my daughter (I'm over it, and I miss it; I'll probably go for a Mini soon). My phone runs Android, and I recently purchased a Nexus 7 tablet because I thought the Android ecosystem might merit a hike through its woods. Unlike most users I must maintain some Windows interoperability with all of them, because my day job (IT) is supporting Windows users.


Tech consumers live in an evolving ecosystem wherein mutants are hatched, then evolve; may the best mutant win by a nose. But therein lies a sad reality: all the major platforms are attempting to outdo one another by providing eye candy over substance. In Apple's case, a gated ecosystem passes for simplicity.


It may take more than a few iterations before clear winners emerge in the traditional laptop/desktop marketplace. Even then I suspect, and hope, that a decisive winner is not on today's radar, because the current crop of stuff excites me less than canned beans (with the exception of mobile, the real hot marketplace).


It's easy to get caught up in the new, next best thing. For many, including me, there exists a fascination with learning the new gadget, assuming no noticeable diminution of stability, safety and base bling from the prior model or type. I found iPad to be a perfect example: a marvelous machine with the sweetness of candy, it invites all manner of tapping, swiping and gesturing, raising a hope that some under-the-hood innovation will just pop out and cause intellectual orgasm. But I crave more than candy, I crave the main course, and iPad didn't deliver (yet).


If you gift an iPad, no doubt the giftee will be delighted, particularly cats, children and web consumers. But any serious computer user will want more functionality, a major burn regarding Apple iOS (the operating system for iPhones, iPads and some iPods); Apple dumbs iOS down to enforce arbitrary limitations that aren't defined by hardware/software at all. That's how Apple avoids competing with itself, as giddy Apple fanpeople will purchase at least one of every Apple device that rolls off the dehumanizing Chinese assembly line (to be fair, most manufacturers have some thumb in the unacceptable Chinese labor pie). I justified the purchase of iPad because I needed a long-lived writing tool: portable, lightweight, reasonable screen size, stable. I didn't expect that the iPad/iOS experience would slam the door in my face early on, because it seemed, on the surface, an almost perfect device (with the addition of an external keyboard). Not to be outdone by a machine, I painfully worked around the most aggravating aspects, but not with Apple's help; it was third-party hardware/software that almost saved my bacon.


Even the Mac Air, arguably the dreamiest piece of hardware ever devised, has lousy battery life, less than half that of the iPad, and is bound by a cloistered and restricted ecosystem. Per a comment by Andy Ihnatko of the Chicago Sun-Times, Apple innovates at the launch of a product, then iterates slowly. So don't purchase an Apple device and expect that dramatic innovation will be inserted over time.


Which brings up another point. Modern Apple products are becoming tomorrow's trash much faster than their predecessors because chip, screen and battery design can't keep up with an annual shopping season. Apple has sacrificed longevity for appeal and owner turnover, so I predict that my Air might be serving as a coffee table coaster in about three years. Internal parts, like the SSD (Solid State Drive, a non-mechanical storage device that replaces the hard drive) and battery, are designed for a limited run. So unless your pocket protector is stuffed with proprietary tools and you're willing to pry apart cases and unglue components, the most expensive, arguably most attractive computing platforms (so far) are scheduled to die at the same broad moment. AppleCare, the premium Apple warranty, lasts two years (duh!), the standard warranty only one. The battery might even go before that if repeatedly drained beyond the 50% level, but even if carefully cared for, lithium polymer technology is nominally restricted by chemistry to about three years. A very expensive (disposable?) device that might not make it through an undergraduate degree. And, BTW, iPad follows the same path. Not that volatility exists only within the Apple domain, but a Blade Runner-type, replicant death schedule is not enlightened design.
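
The three-year claim is easy to sanity-check with back-of-the-envelope arithmetic. In this sketch the cycle count, usage rate and calendar limit are all my assumptions, not Apple specifications:

```python
# Back-of-the-envelope battery lifespan estimate; every constant below
# is an illustrative assumption, not a manufacturer specification.
NOMINAL_CYCLES = 1000        # assumed full charge cycles before capacity fades badly
CYCLES_PER_DAY = 1.0         # a heavy user: roughly one full cycle per day
CALENDAR_LIMIT_YEARS = 3.0   # lithium polymer ages chemically even if babied

cycle_limited_years = NOMINAL_CYCLES / (CYCLES_PER_DAY * 365)
useful_life = min(cycle_limited_years, CALENDAR_LIMIT_YEARS)
print(f"Estimated useful battery life: about {useful_life:.1f} years")
# ~2.7 years either way: short of a four-year undergraduate degree.
```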


These are great machines designed in an atmosphere of self-protection and arrogance; Apple profits from hardware sales (unlike the competition), so hardware must turn over for Apple to continue its meteoric climb. Steve Jobs didn't listen to users; he was a genius and benevolent dictator.


'Nuf said for now; perhaps Tim Cook will open this up.


Next up, several months trying to live with Windows 8.

Monday, November 19, 2012

Help! A Bot Elected the President…

It's now common knowledge, at least in the geek community, that Team Obama had data and knew how to execute, and Team Romney didn't have a clue. If that playing field alone had been equal, I'm not certain that the election would have had the same result. So I humbly submit a premise: carefully executed technology decided one of the world's most important offices, rather than (and in spite of all the post-election punditry) ideology, gaffes and advertising. If Team Romney had placed any importance on data as a mechanism to drive results, might the outcome have been different?

The Romney campaign, and even its super PACs, had no respect for 2012 technology. Team Obama, by contrast, constructed an effective, data-driven campaign, built with modern tools by brilliant young minds. I watched Team Obama work their software magic almost every day for several months, not as a participant, but rather as a voyeur. Team Obama boots (on the ground) often drank coffee (when they could afford it) at my favorite watering hole, where I could peek at their screens and deduce activity. In the final two to three months, their software, pulling from many sources, identified blocks of voters who would be crucial to the count, initiated personal calls to those voters through any campaign worker's cell, and, one by one, offered their version of a compelling argument. While Team Romney was firing at imagined targets with BB guns, Team Obama used a laser-accurate, weapons-grade recommendation engine. I propose that this raises a question that will linger: if data mining (this cycle) might have been the most important decision element, then who, or what, are we electing when casting a ballot?
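
For flavor, here is a minimal sketch of the kind of voter-scoring logic such an engine might contain. Every name, field and weight below is hypothetical; the campaign's real system drew on far more data sources and far more sophistication:

```python
# Hypothetical voter records: probability of supporting the candidate,
# and probability of actually showing up to vote. All numbers invented.
voters = [
    {"name": "A", "support_prob": 0.55, "turnout_prob": 0.40},  # persuadable, unreliable
    {"name": "B", "support_prob": 0.95, "turnout_prob": 0.95},  # already locked in
    {"name": "C", "support_prob": 0.50, "turnout_prob": 0.60},
    {"name": "D", "support_prob": 0.10, "turnout_prob": 0.90},  # solid opposition
]

def call_priority(v):
    # Value of a phone call: high for voters who are genuinely on the
    # fence, or likely supporters who may stay home; low for sure
    # things and lost causes.
    persuadable = 1 - abs(v["support_prob"] - 0.5) * 2        # peaks at 50/50 voters
    needs_push = v["support_prob"] * (1 - v["turnout_prob"])  # supporters at risk of not voting
    return persuadable + needs_push

call_list = sorted(voters, key=call_priority, reverse=True)
for v in call_list:
    print(v["name"], round(call_priority(v), 2))
# Campaign workers' phones would be fed the top of this list, one voter at a time.
```

The interesting design choice is what the score rewards: calls go first to voters who are winnable but wobbly, never to sure things or lost causes, which is how a finite number of coffee-fueled volunteers gets aimed where it counts.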

Call me old fashioned. I would rather make election choices on the basis of candidate competency and values instead of underlying software genius, even though I depend on trends/memes for insight. Democrats were clearly betting with a full house, while Republicans held two deuces and stuck with a losing hand.

Is this enough to infer other competencies? Perhaps it actually is, or, at the very least, it's possible to judge who might more effectively leverage new tools if handed the keys to a potentially devastating candy store. While I abhor innocent children being decimated by drone bomb droppings, and the incredible technology that drives them, I don't think that less technical competency would cause a shift to humane diplomacy. Similarly, I often consider and argue that China's bureaucracy is populated with technocrats, while US governing bodies are populated with lawyers, and China seems to be winning the race for world dominance, much the same as Amazon outsells and eventually smacks down the neighborhood bookstore. I don't necessarily agree with these outcomes, but I can't ignore them.

Can we change our election strategy to accommodate less algorithm and more substance? Dunno; the horse seems to be out of the barn, just as the world is inextricably connected to automated data trading, commerce, even push-button (automated?) war and/or revolution. I do posit that a younger, technically savvy Republican won't make the same Team Romney data analysis mistakes, although bold, new mistakes are always a possibility.

Or, we could go all in on contemporary, if frivolous, technology: the next presidential election would cost a helluva lot less if we just analyzed tweets instead of spending all this time and money trying to make informed decisions. May the most effective Twitter analysis win?
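
If anyone wanted to price that out, here's a deliberately naive sketch; the tweets and word lists are invented, and real social-media analysis is far messier than this:

```python
# Tally naive sentiment around each candidate's name. Everything here
# is made up for illustration; treat it as a joke with a print statement.
tweets = [
    "Loving the new jobs report, great news for Obama",
    "Romney crushed that debate, amazing",
    "Ugh, terrible night for Romney",
]
POSITIVE = {"loving", "great", "amazing", "crushed"}
NEGATIVE = {"ugh", "terrible", "awful"}

scores = {"obama": 0, "romney": 0}
for tweet in tweets:
    words = set(tweet.lower().replace(",", "").split())
    sentiment = len(words & POSITIVE) - len(words & NEGATIVE)
    for candidate in scores:
        if candidate in words:
            scores[candidate] += sentiment

print(scores)  # a two-line model of the electorate: cheap, and about as deep as it sounds
```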