A July 13 New York Times article, “That’s No Phone. That’s My Tracker,” by Peter Maass, suggests that we should consider smartphones, computers, and other connected devices as tracking machines rather than appliances of personal convenience.
The manufacturers of these now ubiquitous gadgets claim that aggregating data about individuals benefits the consumer: when you visit a web page, it can display ads relevant to your tastes and needs. But it's widely speculated that far more sinister use is made of this information—that the government enjoys a cozy relationship with the private data gatherers, and that the information can and will be used against us and/or to the advantage of the military-industrial complex.
I assume that Amazon and the NSA know a truckload of tangential information about me: who I befriend and communicate with, the web pages I visit, where I am and where I've been, the stores at which I spend money and the items purchased therein. But they're still missing the most important component: who I am and what I intend to do. If I purchase fertilizer, am I making a bomb or helping my crops? If I purchase boots with deep heels and correspond with persons with Arabic names, am I a terrorist?
Let’s say my eldest granddaughter, April Rose, joins the Peace Corps. April is already an accomplished farmer, so she travels to Africa, proselytizing for sustainable agriculture. While there, April befriends local persons her same approximate age; many are illiterate, and relatively unsophisticated in Western terms.
Some are reluctant rebels who, as women, can face retribution by death, rape, stoning, or starvation. They live in a pressure cooker of male dominance, and without a future. During informal conversations, April argues against violence or suicide, so she should be judged a worthy asset by the US government: a loyal American ambassador attempting to win hearts and minds, spreading the gospel of peace.
However, a government, any government, decides that April might be a threat, not because they really know anything about her moral compass, but because her spreadsheet calculates a suspicious result. In an over-simplified attempt to decide whether she’s enemy or friend, the government decides to intercept her conversations as best they can.
Problem is, April is technically savvy and suspects her friends are being scrutinized by evil forces everywhere. In all communications she uses various encrypted (read: privacy-protected) mechanisms so strong that even the NSA probably can't decipher all the content. Contemporary computer science suggests they know her physical circumstance (accurately) and might be deciphering trigger words plucked from various data streams (inaccurately/partially), including encrypted voice (Skype). We really can't grasp the extent of NSA capability, but when the mechanisms of encryption are cracked, we'll be the last to know.
In this equation, consider the deployment of weaponized drones in all shapes, sizes, and capabilities, which will soon outstrip the presumed benefit of human intervention. When a young drone pilot is incapacitated because of a hangover, or a quota is missed, or the master target map spikes beyond capacity, Gen X flips a switch and allows the robots to think for themselves, or even as an interconnected hive. It's incorrect to assume that an algorithm will spare a life because of a compassionate subroutine (read Kill Decision by Daniel Suarez, wherein sinister, deployable math is modeled on the behavior of aggressive insects).
April's life ledger might not meet any reasonable standard of proof, but the surreptitiously obtained and commercially available information is fed into a target-probability list once an algorithm elevates her score sufficiently. Let's add it up. She lives in an African nation where dangerous persons are known to exist; she even shares meals with them. She arranges the purchase of fertilizer and, most importantly, she continues frequent conversations with her multi-national friends after leaving Africa.
April's life-and-death spreadsheet: simplified, incomplete, misleading.
Although April aims to convert her new social group to peaceful purposes, the aggregated content of her life is, at best, 50 percent decipherable (the software looks for keywords like bomb or jihad, analyzes compression algorithms for probable content, etc.). April's life-and-death spreadsheet: simplified, incomplete, and misleading; beta.
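To make the point painfully concrete, here's a toy sketch in Python of the kind of naive scoring I'm imagining. Every signal, weight, and threshold below is invented for illustration; no agency has published its actual math, and this is emphatically not it.

```python
# A deliberately naive "life ledger" scorer. Every signal, weight, and
# threshold here is hypothetical, invented purely to illustrate the argument.

SIGNAL_WEIGHTS = {
    "lives_in_watchlisted_country": 30,
    "contacts_with_flagged_persons": 25,
    "purchased_fertilizer": 20,
    "uses_strong_encryption": 15,
    "keyword_hits_in_intercepts": 10,
}

THREAT_THRESHOLD = 70  # arbitrary cutoff, for illustration only


def threat_score(ledger: dict) -> int:
    """Sum the weights of whichever tangential signals are present."""
    return sum(w for signal, w in SIGNAL_WEIGHTS.items() if ledger.get(signal))


april = {
    "lives_in_watchlisted_country": True,
    "contacts_with_flagged_persons": True,
    "purchased_fertilizer": True,   # for crops, but the spreadsheet can't know that
    "uses_strong_encryption": True,
}

score = threat_score(april)
print(score, "flagged" if score >= THREAT_THRESHOLD else "ignored")
# Intent never appears in the ledger; the number alone decides.
```

Notice what never makes it into the ledger: intent.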
If in the US, she might be tracked, harassed, or arrested. If still in Africa, she and her friends are vaporized by an invisible, silent drone launched from thousands of miles away, her removal justified by automated, tangential analysis.
Think this is fanciful or improbable? Ask a member of a wedding party in Afghanistan or Yemen that was just blown to bits by a drone operator in Colorado Springs who then left duty in time for Happy Hour at the local saloon. And what happens when Iran, Russia, China, Syria, gangs, thugs, mafia, and/or other outliers deploy similar, competing technology?
Don’t blink, it’s happening already, right under our noses.
We live in a world ruled by government and a gaggle of omnipotent corporations making fundamental, serious judgments about our fellow humans by inadequate proxy. They have the capability to gather enough information about us to sell products or to kill from ten thousand miles away, yet we have no direct knowledge of how it is used. We are in their crosshairs.
Nonetheless, when the numbers dictate, they jail us, take our money, make our lives miserable, and foreclose our homes; and when they think the numbers dictate extreme action, a life is gone, here or anywhere in the world.
Whose granddaughter, nation, or political movement will be next? The enemy algorithm doesn't wait on sufficiently robust technology; rather, it turns bits to bombs when the software is deemed good enough. Popular opinion and an endless stream of apocalyptic news force/allow aggression without proof. Governments and their corporate overlords jail/kill with circumstantial evidence, incredibly flawed human observation, and no moral mandate.
Death by spreadsheet.
There was an epoch wherein the pursuit of a post-K-12 education, or even a high school diploma, might be career related. I'm old, so I remember the Board of Education throwing my fate to hired guns (aka counselors) who claimed to understand how humans will travel through time and what *my* personal vehicle should be.
They determined my aptitude through testing, scholastic achievement (none), and whether or not we were cheerleaders, athletes, etc. With a loud, resounding kathunk, they attempted to staple our futures to a standard form, and then we were off to the armed services, trade school/associate degree, or college. Back in those days, and notwithstanding the price of even the most prestigious institutions, a self-resourced college education was possible, even meaningful.
This was a temporary situation, even if it lasted 50(?) or more years. Robotics and extra-national capitalism eventually ate (and will continue to devour) manufacturing jobs for lunch while inflation rises faster than wages. And because the apparently unfettered distribution of wealth falls heavily in favor of those who *own* the robots, we live in a nation littered with the underemployed, the unemployed, and untold millions who just gave up.
Government and entrenched power pretend that American exceptionalism will continue, as if by magic, as if power hasn't been protecting its own interests, as if government hasn't been squandering limited resources on guns instead of butter. Government and conscience-challenged corporations continue to *just say no to the middle class.*
To further complicate matters and/or suggest the necessity for a momentum shift, I propose that futurework will require indefinable skills (although, most certainly, including high technical and mathematical acumen), set in an interminable future. With the possible exception of some trades, a new career term will last only a few years, not a lifetime. So if teachers intend to imbue students with the skills necessary to survive in the new competitive work environment, are teachers willing to drink some of their own Kool-Aid, that is, re-tool their teaching habits when the need arises?
I humbly suggest that the *adults* who shape our education future consider this: many may guess, but no person alive actually knows what computer and/or life and/or career skills will look like in five years, much less 20. But if trends portend the future, here are a few possibilities to consider before foisting old technology (like chalk, syllabus lectures, and desktop computers) on today's students.
From the Harvard Business Review:
In a recent study, the Bureau of Labor statistics found that the average person... held eleven jobs between the ages of 18 and 46 — meaning a job-switch once every 2.5 years.
From my own Befeebled Brain...
Tablets and Smartphones are already the New Computer (at least for now; wearable computing is close): The move to tablets is not a race to the bottom, but rather a lateral shift to accommodate realistic user need. I applaud the shift, even though I resent the limited useful life of my expensive, environmentally unfriendly computational hardware, currently about two to three years.
Touch, Voice, Eye Tracking, even *Thoughts* are the New Keyboard:
This might not be such a bad trend, except that the (at best) marginal keyboards on mobile devices have contributed to a new illiteracy, that is, substituting IM slang, video, and snapshots for thoughtful, written text. While it's true that pictures can convey abstracted meaning, they're not a full-time substitute for specificity or literacy. Which leads me to…
Voice and Pictures are the New Communications Mode:
Watch teenagers: they will not only be the first to use new modes, they might invent/re-purpose them, Snapchat (new) and Instagram (supposedly trailing) being great examples. 2013 probably will not see the emergence of a more intelligent youth demographic. However, today's 3rd grader will have been raised from an early age with connected awareness, its conventions, and the rapid change of a technology-driven reality. Watch out old folks, *these kids will rule.* Per the chatter of prior generations, "Be there or be square."
Next up: A curated, updated link garden to my favorite education discussions.
Last summer I intended to publish a series of articles on American education, which I presumed had something to do with training young people to participate in futurework. I was on a roll, but only two of several articles had been published when the series was interrupted by the KDE site makeover. In the ensuing months damage control on the KDE site has been unable to resurrect my old posts, while the subject of educational technical innovation began popping up everywhere. And so I left my remaining words, links and research to die.
My intentions were revived after reading the December 19 edition of KDE, wherein educational innovation was being considered. I was frustrated by what I read, not because I disagree with innovation in principle, but because the decision makers, and the manner of their discussion, seemed to be, well, old.
(For those who previously consumed the futurework series, there is suggested herein a hint of plagiarism. And while that's true, I am plagiarizing myself, which I *think* is still acceptable.)
The concept of futurework sticks in my brain like an earworm; it's the filter through which I view the spectacle of our foundering economy. Our nation and an astonishing list of nation-competitors are bound to technology as flesh is to blood, and much-maligned US competitiveness depends on how a scarcity of revenue is invested. Because current, sputtering economics rewards only those at the top of an income pyramid (who invest and leave their taxes extra-nationally, thanks very much), I suggest that the 99% should *insist* on revenue redirection, not for charity, but for self-preservation. And the first re-direct, IMHO, is education funding.
Thomas Friedman suggested (several years ago, when gas was around $2/gal) that the US should impose real, significant gas taxes. He suggested that these additional revenues be pushed directly to education (without any exposure to the *general fund*), both encouraging less dependence on combusting dinosaurs and promoting long-term US economic success.
Mr. Friedman didn't intend to fund a temporary gadgets solution or school militias. He suggested (and I agree) that teachers should achieve pay parity with, say, doctors, attorneys, et al., to attract the best and brightest. Once we have highly educated, motivated, science-believing, literate, well-paid young people *creating the future* (as opposed to being subjects of a ruling class), other problems will be more likely to self-heal (like national stupidity).
I'll presume that most teachers aren't improving their technique just to keep a job; rather, they attempt to equip students with life and career skills (whatever a career is in 2013+). However, even in the light of information enlightenment, many kids are still forced to suffer inconsistent, boring, pedantic, general-purpose chalkboard lectures. While it's true that a teaching career has spanned, and may still span, a lifetime, it's now also true that a traditional teacher's view of what *really* works must be overhauled and updated, particularly when change is forced by technology. Methods must evolve with the times and the times are a'changin'... *really fast.*
This post was carved in half, more later, please stay tuned...
I'm trying, against considerable odds, to think of words that aren't trite in the face of recent events. I want to address issues more relevant than which phone, TV, or tablet to purchase/recommend, even though, and in spite of the tools in my kit, I'm now uncomfortable with all of them. A nagging conscience suggests that broader issues prevail, that I may be part of the problem and not the solution.
Today's tech is powerful, some of it even useful. But whether by the unlikely Zombie apocalypse or the obvious effect of climate change, choosing a gift might also include the choice of nothing at all. Not an easy sell to kids who might cry at their relative deprivation (all the other kids have one), and nerds (like me) who would express similar frustration with adult-like rationalization (all the other adults have one).
Every device, from stone wheel to large screen TV, requires manufacturing, and destructive, blind consumption of it might portend a dark, dystopian future. I understand, and am also weary of, environmental diminution discussions, because such talk is truly annoying.
Arguments of conscience might actually cause me to opt out of a new TV, an iPad for a grandparent, or a new vehicle even before the tires wear out. At issue is whether making and disposing of today's tech product is robbing the future to feed the present. While I do believe that we're way past an environmental tipping point, it's also possible that future technology can save us from ourselves. Game on: will an apparent race to the bottom destroy the ability to create cures at a rate that exceeds the consequences?
Like the pain of a broken arm, cognitive dissonance is never pleasant, but hiding behind indifference, or short-term reward, isn't appropriate either. If I can afford an iPad, if I've worked hard (or not) and/or have a credit card, isn't said purchase my constitutional right? Don't we live in a free country? Who are these pesky environmentalists that harp endlessly about coral reefs, tainted water, and drought? Why should they have a higher argumentative temperature than my own burning wallet?
Perhaps we live in a time of irreconcilable forces. The world economy, such as it is, is said to depend on growth through consumption. We're told that refrigerators, cars, computers, tablets, all manner of gadgets must be renewed, whether we need them or not, and that failure to participate will cause manufacturing to stop and jobs to disappear. Sustenance is old-fashioned and painful; children can't compete if not connected; twenty thousand music tracks/uploaded photos and constant, always-on communication are required minimums, important social covenants.
Is all this connectivity really necessary? To quote Steve Martin (Twitter), "This is weird, but I've found I can read someone's mind just by talking to them for only an hour or two."
When I was a child we often visited the Spring Lake cabin of a family friend. He kept a station wagon that was used only for transport from the cabin door to the boat dock, about 200 feet. His generation was the Greatest Generation, born into the Depression, then conscripted to fight the Second World War. Afterwards these brave soldiers were renewed; they lived in the first robust economy of that epoch, a free market that encouraged consumption and indulgence. These Mad Men weren't consciously destructive, they were merely pursuing the American dream, full stop. Even though the station wagon evaporated more gasoline than it burned, and trips to the gas station were infrequent, it was most certainly an unnecessary use of resources. Hidden behind the familiar Ford logo were the oil, steel, blood, electricity, and other resources necessary to support their relative luxuries.
Even though I know better, I've followed their example like a little duck. My own closet overflows with gadget excess, so it would seem that I haven't really learned a thing. The long view of it makes me cringe. I try to recycle, but it's well known that these programs have a marginal result, that significant, harmful elements of their construction can't be reclaimed by science or alchemy. Much of the gadget waste we produce will end up in third world landfills, picked over by scavenging children and/or dismembered by rain and sun into their component parts, then sent downstream.
Humans are thought to be a superior, evolved species with unmatched survival tools. Brains allow construction of the world's tallest building in ninety days, single submarines that could turn the entire planet to dust, technology that records everything, from purchasing habits to heartbeats. Brains allowed an otherwise weak and inferior species (that would be us) to dominate bears, mountains, oceans, mosquitoes, and apparently inferior peer civilizations.
Hands are the brain's tools, fantastic strength multipliers that fabricated arrows, steam, supercomputers, art, bombs, Bushmaster semiautomatic rifles, yadayada. While pocket-sized gadgets or sense-numbing entertainment centers would seem to be of lesser import than conquering the world or killing children, I wonder: should the dark, destructive results of gadget design be given any less consideration than fossil fuel consumption, over-the-counter assault weapon sales, or uncontrollable greed? Do we live in an epoch of ballyhooed industrial revolution that yields a paradoxical result, wherein the devices that claim extraordinary promise might actually destroy us?
I hope not, I love this stuff, I do the best I can.
Or do I? Please stay tuned...
I don't want to be a Christmas-stealing Grinch, but because it's an election year the hype-o-meter began to peg last summer, and it's stressing me out. Silly me, I worry about value; I consider whether products are the best they could be instead of what they are. The average consumer may not notice a quality deficit because of the well-known seasonal disease, SFF (seasonal feeding frenzy), but I hope the expression of my mildly damning opinion might be useful nonetheless.
Before you purchase, please consider the business plans of these major players:
Per the previous blog post (Meh!), Apple is a hardware company, so it must roll out new devices to maintain meteoric capitalization gains. In the past Apple's various laptop and desktop offerings weren't designed as throwaways; in fact, many older iMacs, towers, and even older Mac laptops are regularly renewed and kept in service. But this isn't optimal for Apple's business model, so its devices are now sold with the expectation of planned obsolescence.
Google (Android/Chrome) is an advertising company, great at collaboration and data mining, but its UI (User Interface) department was originally staffed by old remote-control designers (getting better, but still crude). They throw stuff at walls hoping for adhesion, and some of it sticks (watch the Chromebook; I think it has real potential, and it's very inexpensive). Google's current and future business model is to make people use the web, and it has already amassed a fortune through search. Google dabbles in software and devices, but it's all in service of advertising click-through.
Microsoft is a software company. To maintain profitability (but not increased capitalization, since its stock price hasn't really moved since the discovery of fire), Microsoft must sell software. And now we have the next iteration of its operating system, Windows 8.
Short-term corporate guidance for consumer technology requires that advertising departments concentrate on the next Black Friday, while long-term earnings (aka the long tail) gradually decline. Several years may seem to be quick turnover, but it's actually epochal. The 'net, and its corresponding uptick in consumer use, spans significantly less than twenty years for most users. Therefore, an OS (operating system) life of 10 years, or even 5, is huge.
Microsoft, unlike our confused and dysfunctional Congress, sort of plays for the short and long term simultaneously. With Windows 8, Microsoft chose not to appeal to the traditional office user (read: mouse/keyboard). Rather, it made a major shift, analyzing today's younger users and concluding that they don't give a hoot about anyone's desktop. And why should they? Their lives can be managed, archived (visually, at least), shared, and minimally annotated on a smartphone alone, so why should/would they bother with a content-creation thingy, collaboration, or a winning corporate strategy? Touch is the new mouse, swipe the new UI (user interface), instant, capricious sharing the new privacy. So this currently immature demographic is rightly projected to shape the long tail of business, even though mobile habits don't upscale well (yet).
Make no mistake: under the hood, Windows 8 has some clear advantages over the current most-used OS, Windows 7. Many of those improvements are devised to mitigate Microsoft's reputation for shipping a virus-prone, complicated operating system that shouldn't be gifted to a grandmother. It also shifts user focus to the cloud.
Grammy might be a good candidate for Windows if the platform were intuitive, but it's not. Windows 7 has the best Microsoft UI ever, but it has no natural parallel; it's an acquired taste. For instance, how might Grammy intuit whether to click once or twice, and on what, when to right-click, or how to navigate drop-down menus to make complicated command choices (what does *save as* actually mean?)? Credit Steve Jobs: he moved computing into the next big user interface (iPad, iPhone, iPod Touch), wherein much of how people interact with computational devices can be divined by a three-year-old in five minutes.
Moreover, average consumers don't really need Word and Excel; they want to listen to music, watch YouTube, use Facebook, messaging, and Skype, read/comment on Twitter, and send old-fashioned email (decreasing). While it was once true that a classic, complicated computer was necessary to accomplish basic tasks, it no longer is, so the marketplace is hatching a new species.
Many computers as we've known them are wasting away, some dying in closets, many more retiring to filthy landfills, others remaining renewable upgrade candidates. Most of their owners grew up with mouse moves and keyboard shortcuts, so Microsoft could have continued Windows 7 until Windows 8 was fully baked. After all, Windows 7 is an excellent operating system if properly managed (it often isn't), but sales projections for an already saturated marketplace won't support outrageous executive salaries and operating/development costs. Notwithstanding those customers looking for an inexpensive computer experience, like cash-strapped parents with kids in school, 'tis the season to be spending. Pandering to the seasonal frenzy, Microsoft forces (on new computers) an immature release, claiming it to be an improvement over 7, so customers should just get over it. Apple releases anything and blinded fanpeople fall over themselves to get in line, but Windows 8 is not causing much of a stir, except for controversy. Good riddance. Darwinian.
Windows 8 is a mobile-centric UI trying to hop the desktop barrier. If Microsoft had released Windows 8 for mobile only (including phones, tablets, and touch-enabled laptops), it might have been a reasonable hit, and if price-competitive, even a runaway hit. Instead, Microsoft looked five(?) years ahead and decided that today's business model (woo consumers but cater to business) is running on fumes, so any move is better than no move at all (perhaps true).
Now if the aforementioned Grammy (or any user) can suffer the Windows 8 learning curve on a non-touch device, she'll be in less danger than with XP or even 7. Viruses have a harder time getting a grip inside this new OS, applications of questionable lineage are often rejected, and any Metro application (like Weather, Calendar, People, etc.), when not on screen, is actually taken out of active memory, much like Apple's iOS handles multitasking (not really multitasking; it just takes a snapshot of the application's state and freezes it for future reference). This all combines to make a more stable, less power-hungry operating system, but I don't think mobile 8 will unseat the iPad as the runaway winner of the tablet wars this season. Windows 8 laptop/tablet hardware isn't mature, so the devices on which the MS version of an intuitive(?) user interface must reside, except mobile phones, are not ready.
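For the curious, here's a minimal sketch, in Python rather than anything Microsoft or Apple actually ships, of what "freeze the app's state and evict it from memory" amounts to; the class and its fields are hypothetical stand-ins.

```python
import json


class ToyApp:
    """A stand-in for a Metro/iOS-style app whose state can be frozen."""

    def __init__(self, name: str):
        self.name = name
        self.state = {"scroll_position": 0, "draft_text": ""}

    def suspend(self) -> str:
        # Serialize the app's state so the live object can be dropped from RAM.
        return json.dumps({"name": self.name, "state": self.state})

    @classmethod
    def resume(cls, snapshot: str) -> "ToyApp":
        # Rebuild the app exactly where the user left it.
        data = json.loads(snapshot)
        app = cls(data["name"])
        app.state = data["state"]
        return app


weather = ToyApp("Weather")
weather.state["scroll_position"] = 42
frozen = weather.suspend()       # the app leaves active memory; only this string remains
weather = ToyApp.resume(frozen)  # user taps the tile; state reappears as if nothing happened
```

The payoff is that a suspended app costs almost nothing in RAM or battery, which is exactly the stability and power win described above.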
So I conclude that Windows 8, when used on any device without touch, is disruptive lipstick applied to a stable pig (Windows 7). When used on a hybrid tablet (touch, keyboard and mouse enabled) the experience does get better.
The RT version of Windows 8 is a pure mobile play, distributed on devices like the Surface RT. It's important to note that regular Windows applications won't work at all on RT; they must be specifically designed for that platform, and developers are only nibbling at the bait. RT is designed for devices with limited horsepower but doesn't yet satisfy the needs of traditional Windows users, probably because it straddles the fence between desktop and mobile. Barring some surprising uptick in device improvement (battery life, value), I'm going to wait before purchasing a dedicated, touch-screen Windows 8 device. Hopefully Microsoft will learn the art of successful iteration (read: design shift), so stay tuned for Windows Blue.
But for now, if you're purchasing a new Windows 8 computer and it doesn't have touch, I say fugetaboutit. If you're trying to minimize the financial hit of a new computer and there is a Windows 7 option, you'll find some great bargains out there, particularly in laptops. Plus, 'tis not the season for deferring purchases; the American economy seems to depend on it (pathetic). However, if you're considering an OS upgrade to old hardware, Windows 8 is the least expensive upgrade yet. I've done it on two conventional computers, one desktop with multiple monitors (an upgrade from Windows 7, painless) and one laptop (my MacBook Air). I'm an experienced Windows user, and running Windows 8 is a pain, but more like a nagging toothache than a debilitating sickness; your mileage may vary.
Next up: I don't know. Writing about technology is a dish best served quickly.
This time of year I'm asked to recommend technology. One might think that a plethora of choice could be construed as a plethora of technical innovation, but I submit that the hype is greater than the reality (shocker!). I'm going to explain my bias in several platform-centric installments, rather than wax adoringly about any of them, because the manufacturers already know how to brag. So (spoiler)... bah, humbug and meh.
Like many mortals, I can't afford to go all in with any manufacturer's version of *we're the best for everyone* hype, particularly Apple's (arguably the most expensive). Instead I must pick/choose a few from each type and infer the rest. I own several non-tablet computers (at least 6) running Windows operating systems, but none with touch screens, making Windows 8 a tough sell (more in the next blog post). I also own several Apple devices: the MacBook Air that is my daily toss-about workhorse, an iPod Touch (4G), and a Mac Mini (a media center device); I just gave my iPad (the only one for now) to my daughter (I'm over it and I miss it, will probably go for a Mini soon). My phone runs Android, and I recently purchased a Nexus 7 tablet because I thought the Android ecosystem might merit a hike through its woods. Unlike most users, I must maintain some Windows interoperability with all of them because my day job (IT) is supporting Windows users.
Tech consumers live in an evolving ecosystem wherein mutants are hatched, then evolve; may the best mutant win by a nose. But therein one finds a sad reality: all the major platforms attempt to outdo the others by providing eye candy over substance. In Apple's case, a gated ecosystem substitutes for apparent simplicity.
It may take more than a few iterations before clear winners emerge in the traditional laptop/desktop marketplace. Even then I suspect, and hope, that a decisive winner is not on today's radar, because the current crop of stuff excites me less than canned beans (with the exception of mobile, the real hot marketplace).
It's easy to get caught up in the new, next best thing. For many, including me, there exists a fascination with learning the new gadget, assuming no noticeable diminution of stability, safety, and base bling from the prior model or type. I found the iPad to be a perfect example: a marvelous machine with the sweetness of candy, it invites all manner of tapping, swiping, and gesturing, raising a hope that some under-the-hood innovation will just pop out and cause intellectual orgasm. But I crave more than candy, I crave the main course, and the iPad didn't deliver (yet).
If you gift an iPad, no doubt the giftee will be delighted, particularly cats, children, and web consumers. But any serious computer user will want more functionality, a major burn regarding Apple iOS (the operating system for iPhones, iPads, and some iPods); Apple dumbs iOS down to enforce arbitrary limitations that aren't defined by hardware/software at all. That's how Apple avoids competing with itself, as giddy Apple fanpeople will purchase at least one of every Apple device that rolls off the dehumanizing Chinese assembly line (to be fair, most manufacturers have some thumb in the unacceptable Chinese labor pie). I justified the purchase of an iPad because I needed a long-lived writing tool: portable, lightweight, reasonable screen size, stable. I didn't expect that the iPad/iOS experience would slam the door in my face early on, because it seemed, on the surface, an almost perfect device (with the addition of an external keyboard). Not to be outdone by a machine, I painfully worked around the most aggravating aspects, but not with Apple's help; it was third-party hardware/software that almost saved my bacon.
Even the MacBook Air, arguably the dreamiest piece of hardware ever devised, has lousy battery life, less than half that of the iPad, and is bound by a cloistered and restricted ecosystem. Per a comment by Andy Ihnatko of the Chicago Sun-Times, Apple innovates at the launch of a product, then iterates slowly. So don't purchase an Apple device and expect that dramatic innovation will be inserted over time.
Which brings up another point. Modern Apple products are becoming tomorrow's trash much faster than their predecessors because chip, screen, and battery design can't keep up with an annual shopping season. Apple has sacrificed longevity for appeal and owner turnover, so I predict that my Air might replace several coffee table coasters in about three years. Internal parts, like the SSD (Solid State Drive, a non-mechanical storage device that replaces the hard drive) and the battery, are designed for a limited run. So unless your pocket protector is stuffed with proprietary tools and you're willing to pry apart cases/unglue components, the most expensive, arguably most attractive computing platforms (so far) are scheduled to die at the same broad moment. AppleCare, the premium Apple warranty, lasts two years (duh!), the standard warranty only one. The battery might even go before that if repeatedly drained beyond the 50% level, but even if carefully cared for, lithium-polymer technology is nominally restricted by chemistry to about three years. A very expensive (disposable?) device that might not make it through an undergraduate degree. And, BTW, the iPad follows that path. Not that volatility exists only within the Apple domain, but a Blade Runner-style, replicant death schedule is not enlightened design.
Great machines designed in an atmosphere of self-protection and arrogance: Apple profits from hardware sales (unlike the competition), so hardware must turn over for Apple to continue its meteoric climb. Steve Jobs didn't listen to users; he was a genius and a benevolent dictator.
'Nuf said for now, perhaps Tim Cook will open this up.
Next up, several months trying to live with Windows 8.
It's now common knowledge, at least in the geek community, that Team Obama had data and knew how to execute, and Team Romney didn't have a clue. If that playing field alone had been equal, I'm not certain that the election would have had the same result. So I humbly submit a premise: carefully executed technology decided the contest for one of the world's most important offices, rather than (and in spite of all the post-election punditry) ideology, gaffes, and advertising. If Team Romney had placed any importance on data as a mechanism to drive results, might the outcome have been different?
The Romney campaign, and even its super PACs, had no respect for 2012 technology. Team Obama, by contrast, constructed an effective, data-driven campaign, built with modern tools by brilliant young minds. I watched Team Obama work their software magic almost every day for several months, not as a participant, but rather as a voyeur. Team Obama boots (on the ground) often drank coffee (when they could afford it) at my favorite watering hole, where I could peek at their screens and deduce activity. In the final two to three months, their software, pulling from many sources, identified blocks of voters who would be crucial to the count, initiated personal calls to those voters through any campaign worker's cell, and, one by one, offered their version of a compelling argument. While Team Romney was firing at imagined targets with BB guns, Team Obama used a laser-accurate, weapons-grade recommendation engine. I propose that this raises a question that will linger: if data mining (this cycle) was the most important deciding element, then who, or what, are we electing when we cast a ballot?
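I obviously never saw a line of their code, so the sketch below is nothing more than my guess at the shape of such a tool, in Python, with invented fields, weights, and voters, showing how "pulling from many sources" becomes a ranked call list.

```python
# A toy voter-targeting ranker. Fields, weights, and voters are all invented;
# this is a guess at the shape of such a tool, not the campaign's software.

voters = [
    {"name": "A", "turnout_likelihood": 0.9, "support_score": 0.5, "swing_state": True},
    {"name": "B", "turnout_likelihood": 0.3, "support_score": 0.9, "swing_state": True},
    {"name": "C", "turnout_likelihood": 0.8, "support_score": 0.2, "swing_state": False},
]


def call_priority(voter: dict) -> float:
    """Favor persuadable, likely voters in contested states."""
    # Persuadability peaks when support is near 50/50 and falls off toward the extremes.
    persuadability = 1.0 - abs(voter["support_score"] - 0.5) * 2
    return voter["turnout_likelihood"] * persuadability * (2.0 if voter["swing_state"] else 1.0)


for voter in sorted(voters, key=call_priority, reverse=True):
    print(f"call {voter['name']} (priority {call_priority(voter):.2f})")
```

Multiply that by hundreds of data sources and a national voter file, and you have a campaign that rarely wastes a phone call.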
Call me old fashioned. I would rather make election choices on the basis of candidate competency and values instead of underlying software genius, even though I depend on trends/memes for insight. Democrats were clearly betting with a full house, while Republicans held two deuces and stuck with a losing hand.
Is this enough to infer other competencies? Perhaps it actually is, or, at the very least, it's possible to judge who might more effectively leverage new tools if handed the keys to a potentially devastating candy store. While I abhor innocent children being killed by drone-dropped bombs, and the incredible technology that drives them, I don't think that less technical competency would cause a shift to humane diplomacy. Similarly, I often consider and argue that China's bureaucracy is populated with technocrats while US governing bodies are populated with lawyers, and China seems to be winning the race for world dominance, much the same as Amazon outsells and eventually smacks down the neighborhood bookstore. I don't necessarily agree with these outcomes, but I can't ignore them.
Can we change our election strategy to accommodate less algorithm and more substance? Dunno; the horse seems to be out of the barn, just as the world is inextricably connected to automated data trading, commerce, even push-button (automated?) war and/or revolution. I do posit that a younger, technically savvy Republican won't make the same Team Romney data-analysis mistakes, although bold, new mistakes are always a possibility.
Or, we could go all in on contemporary, if frivolous, technology: the next presidential election would cost a helluva lot less if we just analyzed tweets instead of spending all this time and money trying to make informed decisions. May the most effective Twitter analysis win?