Friday, April 22, 2016

The End Game

The worry about the technological "singularity" is going in the wrong direction.

In my mind, the issue isn't so much artificial intelligence itself becoming "sentient", going rogue, and wiping out or enslaving mankind.

The bigger risk appears to be that powerful humans will leverage this exponentially powerful capability toward the end the powerful have always pursued: to exploit and oppress the less capable populace.

Pretty much every historically modern technology has also been used against people (alongside its dual-use application toward civil/peaceful ends): the printing press, telephone, cameras, airplanes, television, nuclear power, computers, social media, etc.


If modern super-computing capability's most visible "accomplishment" was the front-running of retail investors by flash-crash-causing Wall Street traders, then why wouldn't Artificial Intelligence (among peaceful uses) also be deployed for the exploitation and oppression of general society? ("to extract value," as some business literature has it)  How different is that from the exponentially complex and incomprehensible legal code that imposes ever more consequences on people without their understanding what their rights and recourse are?

The real issues with the technological singularity (as potentially manifested in super-AI) are:
  • speed (manipulations happen so fast that they fly under the radar of human perception, denying people even the opportunity to disagree with what happens),
  • complexity (posing as something else, without giving humans a chance to see through the real intentions/behavior),
  • imposed one-sidedness of actions (no effective feedback channel, no opportunity to negotiate).
All this combined takes away any meaningful way for the average person to analyze, disagree with, and oppose what happens to them in the long run.

Computers, software, websites, and mobile apps are already being leveraged at grand scale to one-sidedly impose processes on average people: call-center software that offers little human choice outside its rigid menu, automated emails that cannot be responded to, phone auto-dialers spamming without any feedback channel, customer service relationships that give the customer no sense of true negotiating power. Dynamic pricing changes with such non-intuitive logic, apparently to confuse the shopper so much that their primitive impulses are tapped to "secure" a price before it changes again.

Business has always been built around manipulation. The convergence of exploiting humans' basic instinct to judge and behave in social context, their craving for validation and personal identity, and state-of-the-art technology enabling massive social reach (what is now called "social media") accelerates manipulation's effectiveness and power to levels previously seen only in primitive societies submitted to divine governance (often by proxy of the chosen one/few) or in fascist nation states (controlled by fear/violence).

The marketing literature is full of neuro-marketing and cognitive-behavioral manipulation techniques to drive the quick sale: suppress conscious thought, pull emotional strings, trigger impulses, bypass rational judgment. Go look it up; it's out there in the open.

Given computers' long history of helping automate the tedious & complex, why wouldn't AI be used to further these self-serving *human* ambitions? That's what we really should be afraid of.


The speed, complexity, and uni-directional channels of highly sophisticated collective machine intelligence (the web, the cloud, the app, the cool tech-company-as-a-brand, the cult of technology and disruption as proxy for progress and freedom) conjure the appearance of an open/free/democratic society, while at the same time providing the ruling class with the same power and control over the masses as much cruder methods in times gone by.

The true risk of advanced AI used as a tool is that this time people may not even realize what is really going on, since they are cognitively and perceptually outsmarted, with the technology being merely the intermediary, the tool for those with the ambitions. Those in power are usually paranoid enough about maintaining their power base that they will make damn sure their tools do not become independent enough to threaten their true masters.

Ultimately, the enablers of all this are the masses, who traded their true freedom for convenience, turned off their thinking capacity for comfort, and willingly opened the city gates to the Trojan horses.

The sophistication growth curve of AI is being met by a steadily declining human cognitive capacity that increasingly prefers simplicity and homogenized media and communication (crisp clustering of preferences and ever more polarized attitudes make behavior easier to manipulate).

In the end, if everybody is so lulled into this new world, and the exploitation is such a seamless facade, perhaps it doesn't matter. If only it weren't for those who feel the economic inequality. But perhaps that "nuisance" will soon be optimized away by that glorious technological singularity.





Continuity versus Disruption

Disruption at all costs?

Technology-driven "disruption" of existing business/economic/societal/political models has become a popular theme among a small but influential group of technology and business leaders, most notably concentrated in Silicon Valley.

And amongst its fervent supporters, the technology-driven business-model "disruption" movement seems to have become a religion of its own. Suddenly every problem is merely a technology solution away from being fixed. This obsessive focus on technology often forgets our roots in the humanities, especially history. Among this self-proclaimed elite, the status quo is unilaterally considered bad and must be unquestioningly subjected to change, regardless of whether previous methods actually work, and without appreciation for the historic context that motivated past and existing methods. As a result, we often end up reinventing the wheel.

Whether something works is now unilaterally determined by a small group of people who seem to have a rather undemocratic view of what justifies change and what broader society needs.
What is masked as pragmatic progressive spirit really seems more a self-justifying zeal that is less receptive to pluralistic examination.

The bigger challenges, the wicked problems, get less attention - the ones that could really use better tools to understand and manage their complexity. Instead, yet another website for mundane tasks, or another "social" app, is hyped and valued out of proportion to its actual economic impact.

In reality, the ideology of "disruption" is used to extract value from the masses into the pockets of the few, the disruption cult. It is more destructive than creative destruction: to the benefit of a few, at the expense of most - a zero-sum game, winner takes all. The larger phenomenon of economic inequality is strongly driven by this trend. Instead of progress, it feels more like "value extraction" (a.k.a. theft, just a new form of it).

But we must be careful about the consequences of this "disruption" mantra. How far do we want to go with destabilizing society and the economy? What is the end game? So far we've only heard about the means. The story of how we all are truly better off is fuzzy and unconvincing.

A generation of act-first-think-later folks ignores (if they have ever learned) the lessons of the past.
The troublesome trend is that the disruption movement has become more about change for change's sake than about broadly and pluralistically justified beneficial outcomes that have the endorsement of the broader populace, as behooves a democratic society. What's missing is a sense of involvement from those affected.


At the core, most humans desire stability. When broad masses of people suffer "future shock" (Toffler, 1970) there is the risk of a societal backlash, possibly akin to what happened in the early 1930s. The more radical the change imposed on people, or the more it sneaks up on them without their initially realizing it, the more people will oppose change.

In general, people are actually quite tolerant of change - as a species we have always adapted to changing circumstances, or we wouldn't have come this far in the history of the earth. The issue is with change that is too abrupt, that people don't understand, can't relate to, and most importantly, the kind of change that leaves them with loss (of income, of housing, of identity, of social empathy).

Change aversion is usually rooted in bad experiences where change led to loss. And the more un-empathetically people with such sensitivities are dismissed, the more of a counterforce to progress they will become, in somewhat of a vicious cycle. Class warfare entrenched.

Disoriented people who feel they have nothing left to lose make for the most dangerous kind of society, one that will be "disrupted" in the most explosive ways, from which it usually takes a generation to recover. People become overly territorial, tribalization gets emphasized, and discrimination and persecution of those who are different increase (often as proxies for the real culprits).

A qualitative difference in modern change is that it is human-caused, not caused by the natural environment. Nature's changes don't plot against humans; they just happen, and therefore are more predictable. Modern societal and economic changes are often engineered. They add an element of game theory that requires anticipating the motivations of others, which ends up in a reciprocal rat-race spiral - an economic/social arms race, so to speak.

Human-engineered change poses two levels of risk: the rate of change, due to its reciprocal nature, is speeding up exponentially; and much of the imposed change is perceived as destructive, which erodes the trust and goodwill needed to function as a society (protectionism leads to hostilities and oppression of those different from one's chosen tribe, even if they aren't the cause of the destructive change).


I propose to focus more on CONTINUITY - not rigidly sticking to the status quo, but making change more gradual, more harmonic, with smooth transitions that bring people along and allow them to find a role for themselves in a changing world.

Ever-accelerating change, driven by and benefiting a few while leaving the masses behind, will eventually cause a massive backlash from which even the original beneficiaries won't be safe. So it should be in their interest, too, to consider how sustainable the current trend of obsessive disruption actually is.

Historically, disruption was called revolution, and it was hardly ever a painless human experience. As smart as we are today, our brightest minds should really solve the dilemma of facilitating change for the better (true progress) in the least destructive fashion.

Yes, in nature's realm, the old often must die before the new can flourish. But isn't one of the hallmarks of humanity that we have transcended raw nature in its primitive mechanisms? Isn't our humanity defined by the ability to solve complex problems that threaten our species' survival in the long run?

With our increasing understanding of natural mechanisms, why don't we strive more for "evolution" instead? Change emerging that way may be far more effective and sustainable. Part of that would be a bit more recognition of the "shoulders of giants" who came before us and on whom our progress stands. A bit more appreciation of historic context instead of singular futurism; mindfulness of whence we came, so as to avoid running in circles and repeating past mistakes.

Continuity - in the sense of moving forward, instead of just keeping busy (with destroying). The focus should be on enhancing and improving, not "disrupting". For most of us, the notion of destroying things doesn't sound as promising as making things better, gradually, convincingly.

As it is, the current trend of Disruption may cause more problems than it aspires to solve.

Continuity seems a good compromise between a non-functional status quo and blind disruption for change's sake. It leaves room for improvement where needed and opportune, and it honors time-tested methods that have been working and that provide the stable backbone supporting the change needed to adapt to contemporary challenges.



Wednesday, April 20, 2016

... and then what?


Thoughts on a future culminating in a singularity.

Light has duality - it is both a particle and a wave.

The future has duality too - it can be a trend (of moving forward in time) or it can be an end-state.

Ever-accelerating, ambitious, human-driven social change (most recently manifested in the maniacal "disrupt" movement) leads one to assume that there is an end goal in mind, an ambition to achieve, a place to reach, an outcome to get closer to.

Once that is reached, accomplished, arrived at…. And then what?

Turning around the popular "what's next?" theme (commonly applied to futurism) into… what's next after the future has manifested?

In this blog, we'll entertain and explore the concept of a societal/cultural singularity - when things can't get any more extreme/faster/newer/more different - when the end is reached, whatever that may be.

The obsessive yearning for ever faster change, in pursuit of "the future", is a manifestation of greed - never getting enough in limited (life) time, hence everything must happen faster so there can be more of it. Greedy for time, greedy for more experiences in less time. Faster, more, better - the same mental reward circuitry running awry as in drug addicts. The culture openly promotes addictive behavior, with literal wording like "something to get addicted to", "binge-watch", etc.

To what extremes may this obsession and greed lead? Or what setbacks will result from overdoing it, going beyond the sustainable? So far every hype, every bubble has eventually burst. The faster and larger the change movements driven by obsessive fervor, the more disastrous the outcomes, as many historical social tragedies have demonstrated (world wars, totalitarian ideologies, etc.).

An interesting question to explore: how much and how fast a rate of societal/cultural change is sustainable before it causes a backlash? (And what are the criteria for recognizing/acknowledging a "backlash" as something undesirable - or can there even be consensus?)


Future Legacy

Legacy versus Liability.

In a personal context, "legacy" is considered a lifetime achievement.
In a technological/societal context, "legacy" means the mistakes from the past we still have to deal with - nuclear waste, asbestos-laced buildings, federal budget deficits and national debt, MRSA, environmental abuse, tribal wars and genocide - the unintended consequences of previously ignorant decisions.

What are we doing today, what are we building today, what decisions are we committing to today, that will have adverse outcomes, unintended effects, as the future emerges?
Some of them will shape the future. Others will clash with our original assumptions of how things would be, and when a new reality sets in, decisions made on old/narrow/short-sighted assumptions will turn into liabilities.

While there can be deliberate improvements to the state of affairs, dealing with bad choices from the past often amounts to just keeping the ship afloat by scooping out the water breaking in, while never getting around to actually fixing the holes that cause the leaking in the first place.

This article is about how our decisions today will contribute to problems in the future that may not be obvious until the future arrives. Then we may regret the short-sighted decisions we made. I will make the point that we should be more mindful about how what we build and decide in the present may become a liability in the future, and that we can mitigate this with more flexible assumptions and a nurturing process for upgrading those assumptions and the solutions we come up with.

We are ever more focused on fixing our mistakes from the past than on actually making genuine progress. And at the other extreme, we hastily jump into future opportunity without considering how these decisions will play out in a larger context (broader as well as further-out consequences).

We live in the present, now. What was before is the past, and what will come next is the future.
What from the current moment appears as the future will, in the future, be its own present. And what is now our present will fall into the past as we arrive at what is today seen as the future.
Past -> Present -> Future are relative concepts. Our consciousness is largely rooted in the present, the now, the moment plus/minus a window of a few hours or days (how each of us interprets the present varies as well).

Many visions of the future anchor on how developments of the present lead into the future. Another way of considering the future is how what is now the present will become future history. The current present will become the past in the future.

In the IT industry there is this concept of "legacy systems", referring to outdated, unsupported systems and applications, which once had their purpose, but as life moved on, have become obsolete. Contrary to the concept of "legacy" in reference to humans, which implies some sort of lifetime accomplishment, in IT culture "legacy" is rather derogatory, implying outdated, useless, cumbersome to keep up for no good reason - it's much more fun to do new things and try out state-of-the-art technology, tools, toys.

Yet we keep repeating the cycle of building future legacy systems - shiny new "solutions" in our respective present - often with little regard for how in time these solutions will become outdated and obsolete, and then have to be cumbersomely maintained or gracefully shut down and transitioned.

Given that our modern civilization depends heavily on complex webs of technology systems and software applications, the vulnerability of aging solutions - losing their effectiveness as life grows beyond their meaningful use, while dependencies on them still exist - is becoming as relevant as similar phenomena in structural engineering, such as collapsing bridges and leaking nuclear waste sites.

The collision of ever faster-moving societal and technological change with static, outdated critical support systems - built for simpler times, on more stable assumptions - is becoming a real problem, to the extent that a whole set of professionals is paid to take care of old systems.
At the beginning of my software developer career, I was paid good money as a young professional for reverse-engineering old software and transitioning (refactoring) it onto more modern platforms, to enable continuity for businesses operating on aging and unsupported application platforms. My wife holds a well-compensated and highly regarded job at a major healthcare company, focusing on what they call "sunsetting": gracefully decommissioning old servers while ensuring regulatory compliance with record retention and patient-data governance. Millions of dollars are spent just to continue the same business process while swapping out the underlying degrading support systems.

Given all this, I see an opportunity to consider how moving into the future changes the position and status of that which doesn't change on its own - how things age without really changing themselves, while the change of life around them makes their standing still look like going backwards, becoming brittle, like material fatigue, except that software doesn't really "age" physically (except perhaps for the concept of "bit rot").

Tuesday, March 1, 2011

The Velocity of Knowledge

It is interesting to see the dichotomy of old-school thinking versus the latest trend when it comes to best practices, concepts, and rules. Each side's proponents tend to claim absolute truth for their movement, when common sense lies somewhere in between. Just like the classic division in politics between the conservative and liberal ends of the spectrum, the most pragmatic approach tends to be somewhere centrist.

So I was thinking about the speed at which knowledge is created in this hyper-linked, twitter-agile information age. Not only is new knowledge created, it is often in conflict with existing knowledge, with established wisdom. And at the speed of change, it is often unclear whether the new is really superior to the old, warranting rapid obsolescence of the past.

Again, the middle ground probably lies somewhere in between. And then there is the great "it depends". The very same knowledge can be applied from different perspectives, warranted in one but not in the other. The rapid trial-and-error approach of online marketing is certainly not conducive to building airplanes or nuclear plants. Yet it's hard to stay competitive in any online or media business these days with the velocity of molasses in January.

Along those lines, I was thinking that people - society - would be well served to establish a knowledge base driven not by the latest version number (newer is not always better) but by merit and/or practical use. Perhaps along the lines of crowd-sourced media judgment and product ratings by online buyers: whichever new concept/approach works well, or better than a prior version, will quickly establish itself via the broad acceptance that crowd ratings would provide.

Perhaps akin to the metrics used in academia that count how many times a publication is cited. I understand Google does the same on a larger scale, but the agents creating the links often have dubious motives that don't necessarily indicate the quality of the source content - an effect that grows as Search Engine Optimization techniques become more prevalent. Will the Semantic Web reset some of these patterns?

Possibly, Wikipedia is progressing toward this philosophy. What I am missing there today is a tangible measure of how established the knowledge is - how proven versus how volatile (as in "still evolving") it is.
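As a thought experiment, the kind of merit measure described above could be sketched in a few lines of code. This is purely illustrative - the scoring formula, the inputs (citation count, crowd ratings, recent edit frequency), and the entry names are all hypothetical assumptions, not an existing system:

```python
import math

def merit_score(citations, ratings, edits_last_year):
    """Toy merit score for a knowledge entry: log-damped citation count,
    average crowd rating, and a stability factor that discounts entries
    still changing frequently (i.e., "volatile" knowledge)."""
    citation_weight = math.log1p(citations)      # diminishing returns on raw popularity
    avg_rating = sum(ratings) / len(ratings) if ratings else 0.0
    stability = 1.0 / (1.0 + edits_last_year)    # heavily edited == still evolving
    return citation_weight * avg_rating * stability

# Rank hypothetical entries by merit rather than recency: a heavily cited but
# still-churning trend scores below a stable, well-rated established method.
entries = {
    "established-method": merit_score(citations=500, ratings=[5, 4, 5], edits_last_year=1),
    "hyped-new-trend":    merit_score(citations=800, ratings=[5, 5, 4], edits_last_year=40),
}
ranked = sorted(entries, key=entries.get, reverse=True)
```

The point of the sketch is the third factor: unlike raw citation or link counts, a volatility discount would give readers the "how established is this?" signal that today's platforms lack.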

One thing is for sure: any heavy-handed, top-down management attempt will fail. The solution lies somewhere in awareness - a culture, so to speak - among all the participants of the vast network of knowledge.

It may one day evolve out of the same technology and drive that today are used for tracking people's activity on the Internet, but for a broader benefit than just online marketing.

Monday, February 28, 2011

Why is it so hard to predict the future?

There is the time machine paradox: if you were able to go back into the past to change things to make the present "better", you might end up altering the course of history that led to the developments that caused you to exist, or that made time-travel technology available to you. A recursive dependency, ending in a logic trap our limited brains can't wrap themselves around.

This is the same reason it is extremely difficult to reliably predict what will happen in the future. The more precisely and tangibly we try to pin down future events, the more likely we are to affect the outcome, leading to an alternate, different future - at least if we communicate our prediction to those involved in the assumed future outcome.

The stock market is a good example. If someone spreads a "secret" tip on how to get rich quick in the stock market, many people will follow, do the same thing, and in effect eliminate the niche opportunity by their very actions of pursuing it.

If we develop a hypothesis as to how the future will shape up, it is most stable if the factors that influence that hypothesized future are unaware of our theory. If the factors involve self-aware agents, like people, they may adapt to intermediate events, or even the expected outcome, and by these actions possibly avoid the assumed future event/outcome.

On the scientific side, popular examples in this realm are Heisenberg's Uncertainty Principle and Schroedinger's Cat thought experiment. As Quantum Computing advances, interesting possibilities may shape up for predicting the future - not necessarily to our satisfaction. The human mind is a paradox in itself. If you ever pondered "how big is the universe?", what if I told you it is <this> big? Then the logical next question would be "where does it end?", and I tell you it ends <there>. "And what comes after that?"

If first-graders already ask such pertinent questions, what can our highly developed adult minds come up with?