CES 2016

With the event officially over as of yesterday, and given that it was my first CES ever, I’d like to share my impressions and overall findings here, perhaps to reflect back on them down the line and have a reference point for what I saw as the most important upcoming shifts in the technology world.


There were more refrigerators than PCs on display!

Nothing screams the end of the PC era louder than CES.  Not a single major PC manufacturer exhibited: no Dell, no HP, no Lenovo.  I saw far more refrigerators and dishwashers there.


People are competing on the marketing front now: think commoditization

CES was clearly about an industry going through mass commoditization.  Differentiation through innovation was scarce; most companies competed on design, color, and marketing muscle.  This is a clear indicator of a lack of innovation and a regression to industry norms.  Even the coolest ideas there were more about nice-to-have conveniences, like a USB stick that doubles as a AA battery or a robotic drink mixer.

Another place I saw this was during the keynotes from Netflix and YouTube.  Both organizations highlighted their original content: they brought actors onto the main stage and announced new shows.  Little about technology was mentioned, except perhaps higher resolution.


The future is full of drones and people with goggles on their heads

The first personal drone was on display.  Many companies were showing off more of the same around drones, but the size of the section indicated the level of interest.  Regulators were also exhibiting, perhaps signaling the need for regulation in this space as drones go mainstream to transport goods, deliver medical supplies, monitor facilities, and, very soon, even carry people.  VR and augmented reality were also a big section.  I feel VR will become a major force in the future, but not just yet: the goggles are heavy and bulky, the views are still a bit distorted, and the software still lags behind desktop.  I think it needs a couple more generations before it becomes compact and agile enough for everyday use.


The innovation that did come up was behind CES

It’s no longer about the stuff on display; it’s about what’s behind it that makes it run better.  IBM’s keynote was all about that.  At first I thought, what’s IBM doing here?

But once IBM’s CEO started explaining, it all made sense.  The intelligence IBM is building to make consumer robots smarter or sportswear more aware is IBM’s part of the story: cloud intelligence and power delivering more intelligent things at home.

Which takes me to the next point: IoT, which was also there, and Samsung made that very clear.  The real technological competition is there, as Samsung gave a subtle hint of what we should expect.  On stage with Samsung were Microsoft, Goldman Sachs, Ascott, and BMW.  This cross-industry collaboration on building IoT platforms, where things from different industries start to work together, is massive.

One interesting idea I ran into was from a startup building intelligent UPS systems for consumers.  Once plugged in at home, the UPS connects to the internet and, depending on electricity rates, optimizes its charge and discharge times: when electricity is expensive, it discharges and feeds the house with battery power, and vice versa.  This is a great example of how the back end is making a dumb front end smart.
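A minimal sketch of the decision rule such a UPS might follow, assuming a known hourly tariff (the function name, rates, and threshold here are all hypothetical, not the startup’s actual logic):

```python
# Sketch of a rate-aware UPS decision rule (hypothetical example).
# Given an electricity tariff per hour, charge the battery when power
# is cheap and discharge to feed the house when it is expensive.

def ups_actions(hourly_rates, threshold):
    """Return 'charge' or 'discharge' for each hour of the day."""
    actions = []
    for rate in hourly_rates:
        if rate > threshold:
            actions.append("discharge")  # expensive hour: run the house on battery
        else:
            actions.append("charge")     # cheap hour: store grid power
    return actions

# Example tariff: cheap at night, expensive during the evening peak.
rates = [0.08] * 7 + [0.12] * 10 + [0.25] * 4 + [0.08] * 3
print(ups_actions(rates, threshold=0.15))
```

A real product would also have to model battery capacity and forecast the tariff, but the core idea is just this comparison running in the back end.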


Some technologies are changing perspectives

Two examples I saw there stood out.  The first was VR’s impact on creative production.  VR is changing the perception of production from one camera pointing at an event, with the viewer observing it, to the observer being part of the event.  Google made a VR movie in which every viewer has a completely personalized experience of the movie depending on where they move during the story.

The second example was a change in how we think about medical care.  With medical wearables hitting the mainstream, health monitoring is going real-time, anywhere, instead of tests carried out every so often at the hospital.  Take the ECG: today most people go to the hospital to get one done occasionally; with a wearable ECG, doctors monitor health in real time, all the time.


Initially I thought I might have wasted my time attending CES; after all, I’m on the enterprise side of technology.  But upon further reflection I began to at least have a sense of where technology as a whole is moving.  My updated understanding is that it’s moving towards art, towards design, and towards convenience.  At the start of the event I attended a session by a seasoned CES speaker, whom I quote here: “we are witnessing a shift from what’s technically possible to what’s technically meaningful.”



Forecasting IT

Wikipedia defines forecasting as “the process of making predictions of the future based on past and present data and analysis of trends”.  Wouldn’t that be a great enabler for those of us who work in IT planning?  To predict what’s around the corner and be prepared for it.  Yet a quick look around, and I could not help but notice how fragmented forecasting is in IT.  Each discipline does a bit of forecasting, yet no one treats forecasting in IT as a core capability that yields valuable planning gains.  Why is that?

Here are a few possible answers:

  1. It’s difficult, which is true: forecasting in IT can be tricky.  Think of all the data about IT (for simplicity, let’s call it metadata) to collect, relate, correlate, analyze, and present.
  2. The current state of fragmented forecasting is good enough, which may also be true.  A capacity manager only cares about capacity forecasting, and a service manager only cares about SLA forecasting; the two can live happily apart.
  3. In an OPEX world, forecasting is not very important.  As IT moves from a capital cost to an operating cost due to cloud computing, outsourcing, and lease schemes, it becomes less important to forecast; elasticity takes care of optimizing costs as the business environment changes.

However, what tends to be excluded from the value proposition of forecasting in IT is that the sum is greater than the parts.  We have never looked at the value of forecasting performed as a core function that includes all the tiny forecasting techniques currently adopted by the different functions within IT.  And in a world that’s moving into the cloud, forecasting may actually become more relevant, because elasticity will become a core capability for every business, moving the competition barrier higher.

Forecasting in IT can be lumped into the following categories:

  1. Demand forecasting
  2. Supply forecasting
  3. Risk forecasting
  4. Operational forecasting

What most of the IT industry misses is the value of combining these different forecasting domains: think of coupling demand and supply forecasting, or risk and operational forecasting.

Aggregating forecasting in IT can open doors to innovative technology management concepts.  For example, if the supply and demand sides of IT forecasting are combined, a technology value chain emerges that details the translation and augmentation processes an IT department may leverage to deliver its services in the future.  This value chain would look different from traditional value-chain constructions because it would be based on predictions about future supply and demand, giving decision makers time to optimize the IT value chain.
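As a toy illustration of that coupling, here is a sketch (with entirely made-up numbers and names) of how combining a demand forecast with a supply forecast surfaces future capacity gaps while there is still time to act:

```python
# Hypothetical example: coupling demand and supply forecasts.
# Quarters where forecast demand outruns planned capacity are flagged
# early, giving planners time to react.

demand_forecast = {"Q1": 120, "Q2": 150, "Q3": 200, "Q4": 260}  # required compute units
supply_forecast = {"Q1": 180, "Q2": 180, "Q3": 180, "Q4": 180}  # planned capacity

def capacity_gaps(demand, supply):
    """Return the quarters where forecast demand exceeds forecast supply."""
    return {q: demand[q] - supply[q] for q in demand if demand[q] > supply[q]}

print(capacity_gaps(demand_forecast, supply_forecast))  # → {'Q3': 20, 'Q4': 80}
```

Either forecast on its own looks fine in isolation; only the combination shows that Q3 and Q4 need attention, which is the point of treating forecasting as one capability.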

The IT forecasting space still lags behind the rest of the industry; however, as technology commoditizes and becomes more elastic, the value of IT forecasting will grow.  In essence, IT forecasting will reach the level of importance that financial forecasting holds for organizations that depend heavily on IT.

At Expit, we are continuously exploring and experimenting with concepts related to IT forecasting; expect more from us down the line in terms of products and services in this space.

The mutation of code

Mutation of functional computer code into harmful computer code may be apocalyptic thinking taken to an extreme, but it is still possible.  This idea crossed my mind while thinking about porting biological evolutionary concepts into technological evolution.  I could not resist imagining what it would take for computers to develop cancer, where code mutates into something harmful.  A computer virus that sparks into existence by chance is another way to think about it.  I doubt we have detected such mutations yet, and while technically such an event is possible, how probable is it?

It will take me some time to put decent numbers to this, but from a conceptual point of view, I’m thinking of computing the probability in the following fashion:

  • The randomness required to generate such an outcome can take place in several places; the following crossed my mind:
    • On disk: while stored
    • At runtime: while in execution
    • In transport: while traveling over a network
    • At compile: while being built
    • At destruction: while being deleted or uninstalled
  • Next, we would have to think of what mutation outcomes are possible:
    • Junk: the code turns into nothing
    • Impactless: nothing really happens because of the mutation
    • Another form of useful code
    • Another form of harmful code
  • And last, we have to take into account the checks and balances already in place that detect such mutations, like checksums and index tables.

Ultimately my thinking is leading towards accepting this hypothesis based on the following back-of-the-envelope numbers:

  • If we take cancer as a reference point: it takes one cell out of 50 trillion (the number of cells in the human body) to cause cancer.  These cells have an average lifespan of about 7 years, so taking the average human lifespan and dividing it by the average cell lifespan gives 70/7 = 10 cell generations.  Dividing by the cancer rate, 1 out of every 4 individuals, gives 40 * 50 trillion = 2,000 trillion (2*10^15), meaning it takes roughly 2*10^15 cell lifetimes to produce one cancer.
  • Now let’s look at computer code mutations.  According to CERN research, the byte error rate is around 3*10^-7, and how many bytes are out there?  Estimates point to 3 zettabytes, or 3*10^21 bytes.  Now let’s crunch the numbers: multiply the bytes out there by the byte error rate to get the expected number of mutated bytes, 3*10^21 * 3*10^-7 = 9*10^14, then divide by the number of mutations it takes to produce one cancer: (9*10^14) / (2*10^15).

That’s roughly 0.45, and given that the amount of data out there almost doubles every year, we should have our first case of computer cancer around 2018!
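These figures can be rerun in a few lines of Python (all values are rough assumptions, not measurements); with the cell count, CERN byte error rate, and 3-zettabyte estimate, the ratio works out to about 0.45, roughly one annual doubling of data away from 1:

```python
# Back-of-the-envelope check of the computer-cancer estimate
# (every figure here is a rough assumption, not a measurement).

cells_per_human = 50e12           # cells in the human body
cell_generations = 70 / 7         # human lifespan / average cell lifespan
cancer_rate = 1 / 4               # one in four individuals develops cancer
mutations_per_cancer = cells_per_human * cell_generations / cancer_rate  # ~2e15

byte_error_rate = 3e-7            # per-byte error rate reported by CERN
bytes_in_world = 3e21             # ~3 zettabytes of data worldwide
mutated_bytes = bytes_in_world * byte_error_rate                         # ~9e14

print(mutated_bytes / mutations_per_cancer)  # ~0.45 of one "computer cancer"
```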

Business models change with IoT

As I continue my dive into the potential impact of IoT, and how IoT will change the business world, I ran into a very interesting example from a company called MetroMile.  This company offers pay-per-mile auto insurance.  It’s a great idea because it takes the flat-rate, one-size-fits-all approach to insurance and pro-rates it to consumption.  This approach can be applied to almost anything that runs on a flat fee today, because consumption and usage measurements are becoming increasingly easy with IoT.  The key point is that a new source of data allows a new form of monetization.  I suspect the greatest applications in the near future will be in the B2B space.
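The pricing shift itself is simple enough to sketch; the premiums and rates below are made up for illustration, not MetroMile’s actual pricing:

```python
# Hypothetical illustration of consumption-based pricing versus a flat fee,
# in the spirit of pay-per-mile insurance (all numbers are invented).

def flat_premium():
    return 120.0                      # fixed monthly premium, usage ignored

def per_mile_premium(miles, base=30.0, rate=0.06):
    return base + miles * rate        # small base fee plus a metered component

# Low-mileage drivers pay far less; heavy drivers converge on the flat rate.
for miles in (200, 800, 1500):
    print(miles, flat_premium(), round(per_mile_premium(miles), 2))
```

The metered component is only possible because an IoT device reports actual miles driven; without that data stream, the flat rate is the only viable model.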


Another change that is a bit more subtle but equally impactful is a change in business objectives.  This one will happen deep inside organizations, but its impact on society will be constructive.  When new data about utility is available, many organizations will have the opportunity to change their business models for the better.  Data availability allows business models to become more aligned with mission statements.  Think of insurance: as more data about customer behavior becomes available, insurance companies will leverage it to focus more on prevention than restitution.  If your health insurance provider notices your heart rate acting up, rather than waiting for a stroke to happen, the provider will most likely contact you and suggest corrective action early on.  This approach is better for the customer and for the business.  We cannot get there today because we lack real-time insight into customer behavior.  IoT will solve this across many industries.

I’m sure there are many more scenarios in which IoT will impact business models, I’ll keep on listing them here as I run into them.

Is it Man made Technology, or Technology made Man?

I’m currently reading a book titled “What Technology Wants” by Kevin Kelly.  The book examines technology from philosophical and evolutionary angles.  In the first chapter, the author describes how technology in the broader sense predates humanity: animals that existed long before humans used tools to alter their environments and improve their lives.  Sounds familiar?

A thought crossed my mind while reading that part: if technology existed before us, can it also be the reason behind our own existence?  If these creatures lived on Earth long before us and used technology to improve their fitness profile, didn’t they directly influence their evolutionary progress as well as ours?  Aren’t we a product of technology in that case?

Could it be that we evolved from the most technologically advanced creatures of that time? The rationale behind this line of thought is rather simple.

The fittest evolve and technology improves fitness.  We are the product of the fittest, who most likely used technology to reach that level of fitness to begin with.

Finally, humans may lose the top of the pyramid spot, and that’s great!

Nikola Tesla famously predicted the mobile revolution we live in today.  His entire prediction came to life except for the very first part: “When wireless is perfectly applied the whole earth will be converted into a huge brain”.  That part is not true yet; in fact, we are still far from it.  Today the whole world is eyes and ears, not a brain.  We do not think as one, we certainly do not behave as one, and we do not progress as one.  When we do reach that point, it won’t be us who benefit from the outcome, it will be machines, because thinking as one is simply beyond us.  The human race is rapidly reaching its capacity to continue the evolutionary process, for reasons I will detail later.  Artificial intelligence will take over, and that’s natural, I think.  It’s also a good thing: we should embrace the impending takeover of machines, not fear it.

So what did Tesla mean by one huge brain anyway?  Not to dive too deep into brain science, but I’m willing to guess he meant a central system for the entire planet, with specialization distributed across it.  In essence, it’s the notion that all known information is stored within this brain, and any entity on the planet can access any piece of information stored anywhere within this system and leverage the entire available computational capacity on demand.  The storage and access exist today; the links between all the information on this planet do not.

We can start to imagine the existence of this central database for the world, where all information is stored.  Data is organized in such a way that makes retrieval instantaneous and relevant.  I won’t need to find an app to tell me what I need to do or know; it should just come up.

Once that’s done, step two becomes possible: the knowledge I have should be readily available to anyone.  And I’m not talking about data here; I’m talking about consciousness, sense, opinion, experience, and all the other stuff in our heads that’s beyond information.

When we reach that point, we will have a single brain on this planet, and by then humans will have moved into the background.  Machines will take over.  My reasoning behind this is threefold: we as a species are inefficient, unfit for this next step of evolution.

  • We are wasteful in many senses; just look at what we have done to our own planet.  Humans can’t be completely efficient: we need fun, vacations, weekends, and social interaction, none of which machines require.
  • We are irrational.  Economics finally came to terms with this in what is called complexity economics, where the basis of the science is no longer a rational agent but a human: one who makes the odd goofy call, buys something he or she does not need, or would rather pay more for the same thing just to feel good about it!
  • Our brain capacity is hitting its peak: no one will beat IBM’s Deep Blue at chess.  To validate this point, look no further than education.  Training a human doctor takes years, while recent developments in robotics are allowing doctors to step aside and watch R2-D2 tinker with human brains!

It’s natural to hand this task over to the next species, which will most likely be a better fit for the continuation of the evolutionary process.  Machines are far more efficient, rational, and intelligent if built correctly with these objectives in mind.  And even if they aren’t built that way, they will get themselves there: self-improvement is a core part of AI.

Many scientists around the world are calling for precautionary measures against the rapid development of AI, especially in military applications.  I don’t think we can really stop this progress.  Think of the bees that chase the queen to pollinate and then die: they cannot control that urge any more than we can stop building AI.  It’s in our DNA.

Accelerating change example: IoT

“The only constant is change.”  I’m not so sure.  My view is that the world goes through accelerating change, at least when it comes to what humans change in the world.  Our impact on the planet, our lives, and scientific progress all point to this view; plenty of charts illustrate it, from environmental impact to population growth to lifespan.  Some time back, while in Palo Alto, I read a sign on the highway that stated, “The First Person to Live to 150 Has Already Been Born”.

An example of this accelerating rate of change comes from a new frontier in technology known as IoT (the Internet of Things).  Initially I thought, what’s the big deal?  More things online.  It wasn’t until I attended a conference held by Microsoft recently that I began to grasp the extent of IoT.

No, it’s not about more things going online.  It’s about things talking to other things talking to other things, and it’s not just about being connected, it’s about changing what connectivity and the internet mean.

This is all supposed to hit the mainstream by 2020!  So in five years, my scale will tell my fridge that I gained a few pounds, and my fridge will decide to switch my milk from full cream to skim.  My scale will also tell my wardrobe to order a new pair of jeans for me, my wardrobe will donate my old jeans to charity, and if I lose the weight, it will call the charity and ask for the pair back!

The impact of such automation and intelligence on human life will be profound.  We do not know the extent of this change, but if you want to start imagining it, rest assured your imagination will become reality sooner rather than later.

Now the real trick is to imagine the world post IoT proliferation.  What will happen after everything is connected and operating at near optimal levels?