Category Archives: Analytics

Security in IoT

@ 4th Annual Nordic Cloud & Mobile Security Forum in Stockholm

 

Internet of Things
as I understand it

IoT emerges at the intersection of Semiconductors, Telecoms, Big Data and their laws. Moore’s Law for Semiconductors, observed as a 60% annual increase in computing power. Nielsen’s Law for Telecoms, observed as a 50% annual increase in network bandwidth. Metcalfe’s Law for networks, observed as the value of a network being proportional to the squared number of connected nodes (humans and machines, many-to-many). The Law of Large Numbers is observed as known average probabilities for everything, so that you barely need statistics anymore. On a Venn diagram IoT looks smaller than any of those three foundations – Semiconductors, Telecoms and Big Data – but in reality IoT is much bigger: it is the digitization and augmentation of our physical world, both in business and in lifestyle.
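To make those rates tangible, here is a quick back-of-the-envelope sketch in Python; the 10-year horizon and the node counts are my own illustrative choices, not numbers from any source.

```python
# Rough arithmetic behind the growth rates mentioned above:
# 60% per year for computing power, 50% per year for bandwidth,
# and network value ~ n^2 for Metcalfe-style effects.

def compound(rate: float, years: int) -> float:
    """Growth factor after `years` of compounding at `rate` per year."""
    return (1 + rate) ** years

years = 10
print(f"Computing power after {years} years: x{compound(0.60, years):.0f}")   # ~x110
print(f"Network bandwidth after {years} years: x{compound(0.50, years):.0f}") # ~x58

# Metcalfe: doubling the number of connected nodes quadruples the network value
for n in (10, 20, 40):
    print(f"{n} nodes -> relative network value {n ** 2}")
```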

1_IoT

How do people recognize IoT? Probably some see only one web, some see another web, others see a few webs. There are six well-known webs: Near, Hear, Far, Weird, B2B, D2D [aka M2M]. Near is the laptop or PC. Hear is the smartphone, smartwatch, armband, wristband, chestband, Google Glass, shoes with some electronics. Far is the TV, kiosk, projection surface. Weird is the voice and gesture interface to Near and Far, with potential new features emerging. B2B is app-to-app or service-to-service. D2D is device-to-device or machine-to-machine.

People used to sit in front of the computer; now we sit within a big computer. In 3000 days there will be a super machine, let’s call it One, according to Kevin Kelly. Its operating system is the web. One identifies and encodes everything. All screens look into One. One can read it all, all data formats from all data sources. To share is to gain, yep, sharing economy. No bits live outside of One. One is us.

2_6webs

Where we are today
or five waves of IoT

Today we are at the Identification of everything, especially visually, and the Miniaturization of everything, especially with wearables and M2M. High hopes are placed on visual identification and recognition. On the one hand, ubiquitous identification is simply needed. On the other hand, visual recognition and classification is probably the way to security in IoT. Instead of enforcing tools or rules, there are policies and some control over how those policies are applied. The rationale is straightforward: technologies change too fast, hence to build something lasting, you should build policies. Policies are empowered by some technology, but remain agnostic to other technologies.

3_5waves

The fifth wave is the augmentation of life with software and hardware…

Who is IoT today? Let’s take Uber. Today it is not IoT. In several years, with self-driven cars, it will be. Tim O’Reilly perfectly described IoT as an ecosystem of things and humans. Below is a comparison, with a significantly extended outlook for tomorrow.

4_uber

It is a great step towards personalized experience that Uber linked Spotify to your cab, so that you experience your individual stage in any Uber car. More about personal experience is in my previous post Consumerism via IoT, delivered in Munich.

IoT Reference Architecture
or magic of seven continues

Well, high-level mind-washing stuff is interesting, but is there a canonical architecture for IoT? What could I touch as an engineer? There is a reference architecture [revealed several weeks ago by Cisco, Intel and others], consisting of seven layers, shown below:

5_7layers

Notice that the upper part is Information Technology, which is non-real-time and which must be personalized. The lower part is Operational Technology, which is real-time or near-real-time, and which is local and geo-spread. The central part is Cloud-aware: it is IT and it is centralized, with strategic geo-distribution, with data centers at primary internet hubs and user locations.

From the infosec point of view, the top level is broken, i.e. people are broken. They continue to do stupid things, they are lazy, so it’s not rational to try to improve people. They will drive you crazy with BYOD, BYOA and BYOT (bring your own device/app/technology). It is better to invest in technologies which are secure by design. Each architectural layer has its own technological & security standards, reinforced by industry standards. Really? Yes for the upper part, and not so obvious for the lower…

Pay attention to the lower part, from Edge Computing downwards. It is a blurry technology as of today; it could be called Fog. Anyway, Cisco calls it Fog. The Fog perfectly reflects the cloud closest to the ground; it encapsulates plenty of computing, storage and networking functionality within. Fog provides localization and context awareness with low latency. Cloud provides global centralization, probably with some latency and less context. Experience on top of Cloud & Fog should provide profiling and personalization, personal UX. The World is flat. The World is not flat. It depends on which layer of IoT you are on right now.

Edge of computing
or computing at the Edge

Data grows so fast that in many scenarios it simply can’t be moved to the Cloud for intelligence; hence BI comes to the Data. Big Data has big gravity and it attracts apps and services to itself. And hackers too. Gathering, filtering, normalizing and accumulating data at the location or elsewhere outside the cloud is called Edge Computing. It is often embedded programming of single-board computers or other mediums (controllers, Arduino, Raspberry Pi, Tessel.io, smartphones when more computing power is required).
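As a rough illustration of that gather-filter-normalize-accumulate loop, here is a minimal sketch of an edge node summarizing raw sensor readings before anything goes upstream; the sensor values and plausibility thresholds are invented for the example.

```python
# Minimal edge-computing sketch: gather raw readings, filter out noise/outliers,
# and accumulate a compact summary that is cheap to ship to the cloud.

from statistics import mean, pstdev

RAW_READINGS = [21.4, 21.6, 21.5, 85.0, 21.7, 21.5, -40.0, 21.6]  # two obvious outliers

def plausible(value: float, low: float = -10.0, high: float = 60.0) -> bool:
    """Drop readings outside the physically plausible range for this sensor."""
    return low <= value <= high

def summarize(readings):
    """Accumulate a small summary instead of forwarding every raw sample."""
    clean = [r for r in readings if plausible(r)]
    return {
        "count": len(clean),
        "dropped": len(readings) - len(clean),
        "mean": round(mean(clean), 2),
        "stdev": round(pstdev(clean), 2),
    }

# Only this summary (a few bytes) travels upstream, not the raw stream.
print(summarize(RAW_READINGS))
```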

6_fog0

 

Fog Computing
or cloud @ data sources

Fog Computing is a virtualized distributed platform that provides computing, storage, and networking services between devices and the cloud. Fog Computing is widespread, uncommon, interconnected. Fog Computing is location-aware, real-time/near-real-time, geo-spread, large-scale, multi-node, heterogeneous. Check out http://www.slideshare.net/MichaelEnescu/michael-enescu-cloud-io-t-at-ieee

6_fog1

Fog is hot for infosec, because plenty of logic and data will sit outside of the cloud, outside of the office, somewhere in the field… and it is so vulnerable because of the immaturity of IoT technologies at that low level.

Secure Fog Fabric
or security by design

How do we find or build technologies for Fog Computing which would be secure by design? Which would live quite long, like TCP/IP:) Is it possible? Do some candidate technologies exist already? Ideally they should be built on top of proven open-sourced tools & technologies, to keep trust and credibility. It all must synergize at a large collaboration scale to break through with a proper tech fabric. So what do we have today? Fog is about computing, storage and networking, just a bit different from the same stuff in the cloud or in the office.

Computing. Which computing is secure, transactional and distributed? And could fit onto a Raspberry Pi? Ever thought about Bitcoin? Ha! Bitcoin’s block chain algorithm is exactly that secure transactional distributed engine, even a platform. Instead of computing numbers for encryption and mining Bitcoins, you could do a more useful computing job. The technology has all the necessary features included by design. Temporary and secure relations are established between smartphones, gadgets and devices, and transactions happen. Check out Block Chain details.
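To show the property that makes a block chain attractive here – a tamper-evident, append-only log – here is a minimal sketch of the hash-chaining idea; it deliberately leaves out proof-of-work, mining and the peer network, so it is an illustration of the principle, not Bitcoin’s actual protocol.

```python
# Minimal hash-chain sketch: every block carries the hash of the previous one,
# so changing any past transaction breaks the chain and is easy to detect.

import hashlib
import json
import time

def make_block(transactions, prev_hash):
    block = {"ts": time.time(), "tx": transactions, "prev": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    """Recompute every hash and check each block points at its predecessor."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block(["genesis"], prev_hash="0" * 64)]
chain.append(make_block([{"device": "sensor-17", "reading": 21.5}], chain[-1]["hash"]))
print(verify(chain))          # True
chain[0]["tx"] = ["tampered"]
print(verify(chain))          # False - the chain detects the modification
```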

Storage. Data sending & receiving. Which technology is distributed, efficient on low-bandwidth networks, reliable and proven? BitTorrent! BitTorrent is not for pirates, it is for Fog Computing. It works for mesh networks and efficient data exchange on many-to-many topologies, built over a P2P protocol. BitTorrent is good for video streaming too. Check out BitTorrent details.

Identification. Well, maybe it’s not identification of everything and everyone, but authentication and authorization are needed anyway, and needed right now. Do we have such a technology? Yes, it is Telehash! It is good for mesh networks, based on JSON, and enables secure messaging. Check out Telehash details.
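For the flavor of device authentication, here is a minimal sketch of signed JSON messages over a shared secret (HMAC); this is my own illustrative stand-in for the job, not the Telehash protocol itself, which works with per-endpoint key pairs and encrypted channels.

```python
# Minimal authenticated device-to-device messaging sketch (NOT Telehash):
# JSON payloads signed with a shared secret so a tampered message is rejected.

import hashlib
import hmac
import json

SHARED_SECRET = b"demo-secret"  # in reality: per-device provisioned keys

def sign(message):
    payload = json.dumps(message, sort_keys=True).encode()
    tag = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
    return {"payload": message, "mac": tag}

def verify(envelope):
    payload = json.dumps(envelope["payload"], sort_keys=True).encode()
    expected = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["mac"])

msg = sign({"from": "thermostat-3", "to": "gateway", "cmd": "report", "value": 21.5})
print(verify(msg))                 # True
msg["payload"]["value"] = 99.9     # a tampered message fails verification
print(verify(msg))                 # False
```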

6_fog2

Fog Computing is a new field; we have to use applicable secure technologies there, or create new, better technologies. Looks like it is going to be a hybrid: something applied, something invented. Check out the original idea from IBM Research for original arguments and ideation.

Security for IoT

A proposal is to go ahead with the OWASP Top 10 for IoT. Just google for OWASP and a code like I10 or I8. You will get a page with recommendations on how to secure a certain aspect of IoT. The list of ten doesn’t match the seven layers of the reference architecture precisely, though some relevance is obvious. Some layers are matched. Some security recommendations are cross-functional, e.g. Privacy.

7_OWASP

For Fog Computing pay attention to I2, I3, I4, I7, I9, I10. All those recommendations can be googled by those names, though they are titled slightly differently on the OWASP site. Below is a list of hyperlinks for your convenience. Enjoy!

I1 Insecure Web Interface
I2 Insufficient Authentication/Authorization
I3 Insecure Network Services
I4 Lack of Transport Encryption
I5 Privacy Concerns
I6 Insecure Cloud Interface
I7 Insecure Mobile Interface
I8 Insufficient Security Configurability
I9 Insecure Software/Firmware Updates
I10 Poor Physical Security

More about the Internet of Things, especially from the user’s point of view, can be found in my recent post Consumerism via IoT.


Consumerism via IoT @ IT2I in Munich

We buy things we don’t need
with money we don’t have
to impress people we don’t like

Tyler Durden
Fight Club

 

Consumption sucks

That sucks. That’s got to be changed. Fight Club changed it in that violent way… Thank God it was in the book/movie only. We are changing it in a different way, peacefully, via consumerism. We are powerful consumers in the new economy – the Experience Economy. Consumers don’t need goods only, they need experiences, staged from goods, services and something else.

EE

Staging experience is difficult. Staging personal experience is the challenge of this decade. We have to gather, calculate and predict for literally each customer. The situation gets more complicated with the growing Do It Yourself attitude of consumers. They want to make it, not just to buy it…

If you don’t have too many customers, then staging of experience could be done by people, e.g. Vitsoe. They write on letter cards exclusively for you! To establish a realistic human-human interface from the very beginning, you, as a consumer, do make it, by shooting pictures of your rooms and describing the concept of your shelving system. The sneaker maker New Balance directly provides a “Make” button, not a buy button, for a number of custom models. You are involved in the making process, it takes 2 days, you are informed about the facilities [in the USA] and so on; though you are just changing the colors of the sneaker pieces – not a big deal for one person, but a big deal for all consumers.

There are big benchmarks in the Experience Economy to look at: Starbucks, Walt Disney. Hey, old school guys, to increase revenue and profit, think of the price going up, and the cost too; think of staging great experiences instead of cutting costs.

 

Webization of life

Computers disrupted our lives, lifestyle, work. Computers changed the world and still continue to change it. The Internet transformed our lives tremendously. It was about connected machines, then connected people, then connected everything. The user used to sit in front of the computer. Today the user sits within a big computer [smart house, smart ambient environment, ICU room] and wears tiny computers [wristbands, clasps, pills]. Let’s recall the six orders of magnitude for human-machine interaction, as Bill Joy named them – Six Webs – Near, Hear, Far, Weird, B2B, D2D. http://video.mit.edu/embed/9110/

Nowadays we see a boost for Hear, Weird, D2D. A reminder of what they are: Hear is your smartphone, smartwatch [strange phablets too], wearables; Weird is the voice interface [automotive infotainment, Amazon Echo]; D2D is device to device or machine to machine [aka M2M]. Wearables are good as anatomical digital gadgets while questionable as pseudo-anatomical ones like Google Glass. Mobile first strategies prevail. Voice interop is available on all new smartphones and cars. M2M is rolling out, connecting “dumb” machines via small agents, which are connected to the cloud with some intelligent services there.

At the end of 2007 we experienced 5000 days of the Web. Check out what Kevin Kelly predicts for the next 5000 days [actually fewer than 3000 days from now]. There will be only One machine, its OS is the web, it encodes trillions of things, all screens look into One, no bits live outside, to share is to gain, One reads it all, One is us…

 

Webization of things

Well, the next 3000 days are still to come, but where are we today? At two slightly overlapping stages: Identification of Everything and Miniaturization & Connecting of Everything. Identification difficulties delay connectivity of more things. Visual identification is especially difficult. Deep Neural Networks did not solve the problem, reaching about 80% accuracy. It’s better than old perceptrons but not sufficient for wide generic application. Combinations with other approaches, such as Random Forests, bring hope for higher accuracy of visual recognition.

A huge problem with neural networks is training, while a breakthrough is needed for ad hoc recognition via a cheap web camera. Intel released the computer vision library OpenCV to engage the community to innovate. Then the most useful features are observed, improved and transferred from the software library into hardware chips by Intel. Sooner or later they are going to ship small chips [for smartphones for sure] with ready-made special object recognition processing, so that users could identify objects via a small phone camera in disconnected mode with accuracy better than 85-90%, which is more or less applicable for business cases.
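In the spirit of that on-device recognition, here is a minimal OpenCV sketch: grab one frame from a cheap camera and run a pre-trained detector locally, without any cloud round-trip. The stock Haar face cascade is used purely as a stand-in for whatever object class you actually need.

```python
# Minimal on-device detection sketch with OpenCV: one webcam frame,
# one pre-trained cascade, no network connection required.

import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)          # default webcam
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    hits = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"detected {len(hits)} object(s): {list(map(tuple, hits))}")
else:
    print("no camera frame available")
```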

5WIoT

As soon as those two IoT stages [Identification and Miniaturization] are passed, we will have ubiquitous identification of everything and everyone, and everything and everyone will be digitized and connected – in other words, we will create a digital copy of us and our world. It is going to be completed somewhere around 2020-2025.

Then we will augment ourselves and our world. Then I don’t know how it will unfold… My personal vision is that humanity was a foundation for other, more intelligent and capable species to complete the old human dream of reverse engineering this world. It’s interesting what will start to evolve after 2030-2040. You could think about Singularity. Phase shift.

 

Hot industries in IoT era

Well, back to today. Today we are still comfortable on the Earth, and we are doing business and looking for lucrative industries. Which industries are ready to pilot and roll out IoT opportunities right away? Here is a list by Morgan Stanley from April 2014:

Utilities (smart metering and distribution)
Insurance (behavior tracking and biometrics)
Capital Goods (factory automation, autonomous mining)
Agriculture (yield improvement)
Pharma (clinical trial monitoring)
Healthcare (leveraging human capital and clinical trials)
Medtech (patient monitoring)
Automotive (safety, autonomous driving)

 

What IoT is indeed?

Time to draw a baseline. Everybody is sure they have a true understanding of IoT. But usually people have biased models… Let’s figure out what IoT really is. IoT is a synergistic phenomenon. It emerged at the intersection of Semiconductors, Telecoms and Software. There was tremendous acceleration in chips and their computing power. Moore’s Law still has not reached its limit [neither at the molecular nor atomic level, nor the economic one]. There was huge synergy from widespread connectivity. It’s Metcalfe’s Law, and it’s still in place, initially for people, now for machines too. Software scaled globally [for the entire planet, for all 7 billion people], got Big Data and reached the Law of Large Numbers.

IoT

As a result of the accelerated evolution of those three domains, we created the capability to go even further – to create the Internet of Things at their intersection, and to try to benefit from it.

 

Reference architecture for IoT

If a global and economic description is too high-level for you, then here you go – 7 levels of IoT – called the IoT Reference Architecture by Cisco, Intel and IBM, presented in October 2014 at the IoT World Forum. The canonical model sounds like this: devices send/receive data, interacting with the network where the data is transmitted, normalized and filtered using edge computing before landing in databases/data storage, accessible by applications and services, which process the data and provide it to people, who will act and collaborate.

ref_arch

 

Who is IoT?

You could ask which company is an IoT one. This is a very useful question, because your next question could be about the criteria, a classifier for IoT and non-IoT. Let me ask you first: is Uber IoT or not?

Today Uber is not, but as soon as the cars are self-driven, Uber will be. The only missing piece is a direct connection to the car. Check out the recent essay by Tim O’Reilly. Another important aspect is to mention society, as a whole and each individual, so it is not the Internet of Things, but the Internet of Things & Humans. Check out those ruminations http://radar.oreilly.com/2014/04/ioth-the-internet-of-things-and-humans.html

Humans are consumers, just a reminder. Humans are an integral part of IoT; we are creating IoT ourselves, especially via networks, from wide social ones to niche professional ones.

 

Software is eating the world

Chips and networks are good, but let’s look at booming software, because technological progress now depends vastly on software, and it’s accelerating. Each industry opens more and more software engineering jobs. It started from office automation, then all those classical enterprise suites: PLM, ERP, SCADA, CRM, SCM etc. Then everyone built a web site, then added a customer portal, web store, mobile apps. Then integrated with others, business app to business app, aka B2B. Then logged huge clickstreams and other logs such as search and mobile data. Now everybody is massaging the data to distill more information on how to meet business goals, including consumerism-shaped goals.

  1. Several examples to confirm that digitization of the world is real.
    Starting from the easiest example for understanding – newspapers, music, books, photography, movies went digital. Some of you have never seen films and film cameras, but google it, they were non-digital not so long ago. Well, the last example from this category is the Tesla car. It is electrical and got plenty of chips with software & firmware on them.
  2. The next example is more advanced – intellectual property shifts to digital models of goods. The 3D model with all related details does matter, while the implementation of that model in a hard good is trivial. You have to pay for the digital thing, then 3D print it at home or in a store. As soon as fabrication technology gets cheaper, the shift towards digital property will be complete. Follow Formula One; their technologies are transferred to our simpler lives. There is digital modeling and simulations, 3D printing into carbon, connected cars producing tons of telemetry data. As soon as the consumer can’t distinguish 3D printed hard goods from those produced with today’s traditional methods, and as soon as the technology is cheap enough – it is possible to produce as late as possible and as adequately as possible for each individual customer.
  3. All set with hard goods. What about the others? Food is also 3D printed. The first 3D printed burger from Modern Meadow was printed more than a year ago, BTW funded by googler Sergey Brin. The price was high, about $300K, exactly the amount of his investment. Whether food will be printed or produced via biotech goo, the control and modeling will be software. You know recipes and processes; they are digital. They are applied to produce real food.
  4. Drugs and vaccines. Similar to the food and hard goods. A great opportunity to get quick access to brand new medications is unlocked. A vaccine could be designed in Australia and transferred as a digital model to your [or a nearby] 3D printer or synthesizer; your instance will be composed from the solutions and molecules exclusively for you, and timely.

So whatever your industry is, think about more software coding and data massaging. Plenty of data, global scale, 7 billion people and 30 billion internet devices. Think of traditional and novel data; augmented reality and augmented virtuality are also digitizers of our lives, towards real virtuality.

 

How to design for IoT?

If you know how, then don’t read further, just go ahead with your vision, I will learn from you. For others my advice will be to design for personal experience. Just continue to ride the wave of more & more software piece in the industries, and handle new software problems to deliver personal experience to consumers.

First of all, start recognizing novel data sources, such as Search, Social, Crowdsourced, Machine data. It is different from traditional CRM and ERP data. Record data from them, filter noise, recognize motifs, find intelligence origins, build data intelligence, and bind it to existing business intelligence models to improve them. Check out Five Sources of Big Data.

Second, build information graphs, such as Interest, Intention, Consumption, Mobile, Social, Knowledge. A consumer has her interests, why not count on them? Despite the interests, the consumer’s intentions could be different, why not count on them? Despite the intentions, the consumer’s consumption could be different, why not count on them? And so on. Build a mobility graph, a communication graph and other graphs specific to your industry. Try to build a knowledge graph around every individual. Then use it to meet that individual’s expectations or bring individualized unexpected innovations to her. Check out Six Graphs of Big Data.

As soon as you grasp this, your next problem will be handling multi-modality. Make sure you get mathematicians into your software engineering teams, because the problem is not trivial – quite the opposite. The good news is that for each industry some graph may prevail, hence everything else could be converted into attributes attached to the primary graph.

concept

 

PLM in IoT era

Taking simplified PLM as BEFORE –> DURING –> AFTER…

BEFORE.
Design of the product should start as early as possible, and it is not isolated; instead, foster co-creation and co-invention with your customers. There is no secret number for how much of your IP to share publicly, but the criterion is simple – if you share insufficiently, then you will not reach the critical mass to trigger consumer interest; and if you share too much, your competitors could take it all. The rule of thumb is about technological innovativeness. If you are very innovative, let’s say a leader, then you could share less. Examples of technologically innovative businesses are Google and Apple. If you are technologically not so innovative, then you might need to share more.

DURING.
The production or assembly should be as optimal as possible. It’s all about transaction optimization via new ways of doing the same things. Here you could think about the Coase Law upside down – outsource to external partners, don’t try to do everything in-house. Shrink until the internal transaction cost equals the external one. Specialization of work brings [external] costs down. Your organization structure should shrink while the network of partners grows. In the modern Internet the cost of external transactions could be significantly lower than the cost of your same internal transactions, while the quality remains high, up to the standards. It’s the known phenomenon of outsourcing. Just Coase upside down, as Eric Schmidt mentioned recently.

Think about individual customization. There could be mass customization too, by segments of consumers… but it’s not as exciting as individual customization. Even if it is such a simple selection as the colors of your phone or sneakers or furniture or car trim, it should take place as late as possible, because it’s difficult to forecast far ahead with high confidence. So try to squeeze useful information from your data graphs as close to the production/assembly/customization moment as possible, to be sure you made the most adequate decisions that could be made at that time. Optimize inventory and supply chains to have the right parts for customized products.

AFTER.
Then try to keep the customer within the experience you created. Customers will return to you to repeat the experience. You should not sit and wait until the customer comes back. Instead you need to evolve the experience; think about an ecosystem. Invent more – costs may rise, but the price will rise even more, so don’t push for cost reduction, push for innovativeness towards better personal experiences. We all live within experiences [BTW more and more digitized products, services and experiences]. The more a consumer stays within the ecosystem, the more she pays. It’s the experience economy now, and it’s powered by the Internet of Things. Maybe it will rock… and we will avoid Fight Club.

 

PS.

Visuals…


Color Paradigm Shift

This post is about colors and emotions – especially about modern relations between colors and emotions.

Old School?

Take a look at this color wheel of emotions; it is available online as a PDF in high resolution and better quality.
wheel

Questions

Let’s question several colors exclusively, e.g. Red, Orange, Violet, Blue, Yellow…

Red

The color wheel says it is about feeling angry, irate, frustrated, aggravated. Really? What about Christmas cards and stripes? They are all red. Santa Claus is all red. St Valentine stuff is all red, and it is about love. All wedding packages use red for love and passion. On metropolitan maps you have “You are here” in red. On Google Maps you have red location balloons. Women’s lips are red. The children’s character Mickey Mouse is red. Strawberries, watermelon, cherries, apples, pomegranates are red. There are so many positive reds in nature, society, technology etc., and they are in no way negative. Instead they are pretty positive.
red

Orange, Yellow

The color wheel says orange is nervous, worried, concerned, confused; yellow is afraid, apprehensive… Let’s see what we have and use in orange and yellow and check the emotions. Oranges are in no way nervous or concerned; orange juice is a common daily drink. Yellow dresses are cool. An orange Lambo is a standard color, and all is good with the emotions.
orange_yellow

Violet, Blue

The color wheel says violet is disgust, distaste, disappointment; blue is sadness, grief. Let’s check it around us. What about the violet fairy? Violet lavender fields, violet blossom? Blue is common at concerts – check out the pleasant folk music by Blackmore’s Night… The horizon is often blue and there is nothing sad about it.
violet_blue

Conclusion

The perception of colors is different today; it seems it has been different for years. That emotion color wheel is wrong. There are so many real life cases when vivid violet, orange, blue are used and it looks positive, emotionally positive. Check out the cutting-edge wearable products from Jawbone and see that there is nothing wrong with the colors at all. Emotions are driven by combinations, via balance, via synergy, but never by a damn single color.
jawbone

PS.
Don’t know how to add more vertical space before headers. WordPress usability sucks.


A Story behind IoE @ THINGS EXPO

This post is related to the published visuals from my Internet of Everything session at THINGS EXPO in June 2014 in New York City. The story is relevant to the visuals, but there is no firm affinity to particular imagery. Now the story is more of a standalone entity.

How many things are connected now?

Guess how many things (devices & humans) are connected to the Internet? Guess who knows? The one who produces those routers that move your IP packets across the global web – Cisco. Just navigate to the link http://newsroom.cisco.com/ioe and check the counter in the top right corner. The counter doesn’t look beautiful, but it’s live, it works, and I hope it will continue to work and report the approximate number of things connected to the Internet. Cisco predicts that by 2020 the Internet of Everything has the potential to connect 50 billion things. You could check yourself whether the counter is already tuned to show 50,000,000,000 on the 1st of January 2020…

Internet of Everything is next globalization

Good old globalization was already described in The World is Flat. With the rise of smart phones with local sensors (GPS, Bluetooth, Wi-Fi), the flatness of the world has been challenged. Locality unflattened the world. New business models emerged as “a power of local”. The picture got mixed: on the one hand we see the same burgers, Coca-Cola and blue jeans everywhere, consumed by many; on the other hand we already consume services tailored to locality. Even hard goods are tailored to locality, such as cars for Alaska vs. Florida. Furthermore, McDonald’s proposes a locally augmented/extended menu, and Coca-Cola wraps the bottles with locally meaningful images.

Location itself is insufficient for the next big shift in business and lives. Context is the breakthrough personalizer. And that personal experience is achievable via more & smaller electronics, broadband networks without the roaming burden, and analytics from Big Data. New globalization is all about personal experience, everywhere for everyone.

Experience economy

Today you have to take your commodities, together with made goods, together with services, bring it all onto the stage and stage personal experience for a client. It is called the Experience Economy. Nowadays clients/users want experiences – repeatable experiences like in Starbucks or a lobster restaurant or a soccer stadium or a taxi ride. I already have a post on Transformation of Consumption. The healthcare industry is one of the early adopters of IoT, hence it deserves a separate mention; there is a post on Next Five Years of Healthcare.

So you have to get prepared for higher prices… It is the cost of staging a personal experience – a very differentiated offering at a premium price. That’s the economic evolution. Just stick to it and think how to fit there with your stuff. Augment business models correspondingly. Allocate hundreds of MB (or soon GB) for user profiles. You will need to store a lot about everybody to be able to personalize.

Remember that it’s not all about consumer. There are many things around consumer. They are part of the context, service, ecosystem. Count on them as well. People use those helper things [machines, software] to improve something in biz process or in life style, either cost or quality or time or emotions. Whatever it is, the interaction between people and between people-machine is crucial for proper abstraction and design for the new economy.

Six Webs by Bill Joy

Creator of Berkeley Unix, creator of the vi editor, co-founder of Sun Microsystems, partner at KPCB – Bill Joy – outlined six levels of human-human, human-machine and machine-machine interaction. That was about 20 years ago.

  • Hear – human & intimate device like phone, watch, glasses.
  • Near – human & personal while less intimate device like laptop, car infotainment.
  • Far – human & remote machines like TV panels, projections, kiosks.
  • Weird – human-machine via voice & gesture interfaces.
  • B2B – machine-machine at high level, via apps & services.
  • D2D – machine-machine as device to device, mesh networks.

About 10 years ago Bill Joy reiterated on the Six Webs. He pointed to “The Hear Web” as the most promising and exciting for innovations.

“The Hear Web” is anatomical graphene

The human body has been anatomically the same for hundreds of years. Hence the ergonomics of wearables and handhelds is predefined. Bracelets, wristwatches, armbands are the gadgets we could wear on our arms for now. The difference is in the technology. Earlier it was mechanical, now it is electrical.

Vitruvian

We are still not there with human augmentation to speak about under-skin chips… but that stuff is being tested already… on dogs & cats. A microchip with information about the rabies vaccination is put under the skin. Humans also pioneer some things, but it is still not mainstream enough to talk much about.

For sure “The Hear Web” was a breakthrough during recent years. The evolution of smartphones was amazing. The emergence of wrist-sized gadgets was pleasant. We are still to get clarity on what will happen with glasses. Google experiments a lot, but there is a long way to go until the gadget is polished. That’s why Google experiments with contact lenses. Because GLASS still looks awkward…

The brick design of the touch smartphone is not the true final one. I’ve figured out a significant issue with the iPhone design. LG Flex is experimenting with bendable screens, but that’s probably not a significantly better option. High hopes are on Graphene. Nokia sold its plastic phone business to Microsoft, because Nokia got a multi-billion grant to research graphene wearables. Graphene is good for electricity, highly durable, flexible, transparent. It is much better for the new generation of anatomically friendly wearables.

BTW there will probably be no space for full-blown HTML5/CSS3/JavaScript browsers on those new gadgets. Hence forget about SaaS and think about S+S. Or client-server, which is called client-cloud nowadays. The programming language could be JavaScript, as it works on small hardware already, without fat browsers running spreadsheets. Check out Tessel. The pathway for the current medium between gadgets & clouds is: smartphone –> Raspberry Pi –> Arduino.

D2D

D2D stands for Device-to-Device. There must be standards. High hopes are on Qualcomm. They are a respected chipset & patents maker. They have proposed AllJoyn – an open source approach for connecting things – during recent years. All common functionality such as discovery/onboarding, Wi-Fi comms and data streaming is to be standardized and adopted by the developer community.

The AllSeen Alliance is an organization of supporters of the open source initiative for IoT. It is good to see there names like LG, Sharp, Haier, Panasonic, Technicolor (Thomson) as premier members, and Cisco, Symantec and HTC as community members. And it is really nice to see one of the benchmarks of Wikinomics – Local Motors!

For sure Google will try to push Android onto as many devices as possible, but Google must understand that they are players in plastic gadgets. It’s better to invest money into hardware & graphene companies and support the alliance with money and authority. IoT needs standards, especially at the D2D/M2M level.

How to design for new webs?

If you know how – then go ahead. Else – design for personal experience. The Internet of Everything includes semiconductors, telecoms and analytics from Big Data.

IoT

 

Assuming you are in the software business, let semiconductors continue with Moore’s Law, let telecoms continue with Metcalfe’s Law, while you concentrate on Big Data to unlock the analytics potential, for context handling, for staging personal experience. Just consider that Metcalfe’s Law could be spread onto human users and machines/devices.

Start the design of the Six Graphs of Big Data from the Five Sources of Big Data. The relation between graphs and sources is many-to-many. Blending of the graphs is not trivial. Look into Big Data Graphs Revisited. Conceptualization of the analytics pipeline is available in Advanced Analytics, Part I. The most interesting graphs are Intention & Consumption, because the first is a plan and the second is a fact. When they begin to match, your solution begins to rock. Write it down and follow it – data is the next currency. 23andme and Uber log so much data beneath the visible cap of the service you see and consume…

Where are we today?

There are five clear waves of the IoT. Some of those waves overlap, especially ubiquitous identification of people or things indoors and outdoors. If the object is big enough to be labeled with an RFID tag or a visual barcode, then it is easy. But small objects are labeled neither with a radio chip nor with an optical code. No radio chip because it is not good money-wise, e.g. bottles/cans of beer are not labeled because it’s too expensive per item. The pallets of beer bottles are labeled for sure, while a single bottle is not. There is no optical code either, so as not to spoil the design/brand of the label. Hence there is a problem, and we have to look for alternative identification – optical – via image recognition.

The third wave includes image recognition, which is not new, but it is still tough today. Google has trained the Street View brain to recognize house numbers and car plates at such a high level of accuracy that they could crack captchas now. But you are not Google, and you will get 75-78% with OpenCV (properly tuned) and 79-80% with deep neural networks (if trained properly). The training set for deep learning is a PITA. You will need to go to each store & kiosk and take pictures of the beer bottles under different light conditions, rotations, distances etc. Some footage could be made in the lab (like Amazon shoots the products from 360 degrees) but plenty of the work is your own.
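One way to soften that training-set pain is to synthesize the variations yourself. Below is a minimal sketch that turns each photo into a couple dozen variants via rotation, brightness shifts and rescaling; the file names and parameter values are illustrative only.

```python
# Minimal training-set augmentation sketch: simulate rotation, lighting and
# shooting-distance (scale) variations from one source photo.

import cv2
import numpy as np

def augment(img: np.ndarray) -> list:
    h, w = img.shape[:2]
    variants = []
    for angle in (-15, 0, 15):                         # rotation
        M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
        rotated = cv2.warpAffine(img, M, (w, h))
        for beta in (-40, 0, 40):                      # brightness shift ~ lighting
            lit = cv2.convertScaleAbs(rotated, alpha=1.0, beta=beta)
            for scale in (0.75, 1.0, 1.25):            # scale ~ shooting distance
                variants.append(cv2.resize(lit, None, fx=scale, fy=scale))
    return variants                                    # 27 variants per source photo

src = cv2.imread("bottle_001.jpg")                     # hypothetical source image
if src is not None:
    for i, v in enumerate(augment(src)):
        cv2.imwrite(f"bottle_001_aug_{i:02d}.jpg", v)
```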

 

FiveWavesIoT

The fourth wave is about the total digitization of the world; then the newer world will work with digital things via telepresence & teleoperations. Hopefully we will dispense with all those power adapters and wires by that time. “Software is eating the World”. All companies become software companies. Probably you are comfortable with digital music (both consuming and authoring), digital publishing, digital photos and digital movies. But you could have concerns with digital goods, when you pay for the 3D model and print it on a 3D printer. While the atomic structure of the printed goods is different, your concern is right, but as soon as the atomic structure is identical to [or even better than] what the old original good has, your concern is useless. Read more in Transformation of Consumption.

With 3D printing of hard goods it’s more or less understandable. Let’s switch to 3D printed food. Modern Meadow printed a burger a year ago. It cost $300K, approximately as much as Sergey Brin (Googler) invested into Modern Meadow. Surprised? Think about printing the newest or personal vaccines and so forth…

Who is IoT? Who isn’t?

Is Uber IoT or not? With human drivers it is not. When human-driven cabs are substituted by self-driving cabs, then Uber will become an IoT company. There is an excellent post by Tim O’Reilly about the Internet of Things & Humans. The CEO of Box.com, Levie, tweeted “Uber is a $3.5 billion lesson in building for how the world *should* work instead of optimizing for how the world *does* work.” IoT is not just more data [though RedHat said it is], IoT is how this world should work.

How to become IoT?

  • Yesterday it was from sensors + networks + actuators.
  • Today it becomes sensors + networks + actuators + cloud + local + UX.
  • Tomorrow it should be sensors + networks + actuators + cloud + local + social + interest + intention + consumption + experience + intelligence.

Next 3000 days of the Web

It was a vision for 5000 days, but today only 3000 days are left. Check it out.

Next 6000 days of the Web

Check out There will be no End of the World. We will build such a big and smart web that we as humans will prepare the world for the phase shift. Our minds are limited. Our bodies are weird. They survive in a very narrow temperature range. They are afraid of radiation and gravity. We will not be able to go into deep space to continue reverse engineering of this World. But we are capable of creating the foundation for a smarter intelligence, who could get there and figure it out. Probably we would not even grasp what it was… But today the IoT pathway brings better experiences, more value, more money and more emotions.

PS.

Let’s check the Cisco Internet of Things counter. ~300,000 new things have connected to the Internet while I wrote this story.


Big Data Graphs Revisited

Some time ago I outlined the Six Graphs of Big Data as a pathway to individual user experience. Then I did the same for the Five Sources of Big Data. But what’s between them remained untold. Today I am going to give my vision of how different data sources allow building different data graphs. To make it less dependent on those older posts, let’s start from a real-life situation and business needs, then bind them to data streams and data graphs.

 

Context is King

The same data in different contexts has different value. When you are late for your flight and you get a message that your flight was delayed, it is valuable – in comparison to receiving the same message two days ahead, when you are not late at all. Such a message might be useless if you are not traveling at all, but the airline company has your contacts and sends the message about a flight you don’t care about. There was only one dimension – time to flight. That was a friendly description of the context, to warm you up.

Some professional contexts are difficult to grasp for the unprepared. Let’s take a situation from the office of some corporation. Some department manager intensified his email communication with the CFO, started to use the phone more frequently (also calling the CFO and other department managers), went to the CFO’s office multiple times, skipped a few lunches, and remained at work till 10PM several days in a row. Here we got multiple dimensions (five), which could be analyzed together to define the context. Most probably that department manager and the CFO were doing some budgeting: planning or analysis/reporting. Knowing that, it is possible to build and deliver individual prescriptive analytics to the department manager, focused on and helping to handle the budget. Even if that department has other escalated issues, such as the release schedule or so, the severity of the budgeting is much higher right now, hence the context belongs to the budgeting for now.

By having data streams for each dimension we are capable of building a run-time individual/personal context. The data streams for that department manager were a kind of time series: events with attributes. Email is a dimension we are tracking; peers, timestamps, type of the letter, size of the letter, types and number of attachments are attributes. Phone is a dimension; names, times, durations, number of people etc. are attributes. Location is a dimension; own office, CFO’s office, lunch place, timestamps, durations, sequence are attributes. And so on. We defined potentially useful data streams. It is possible to build an exclusive context out of them, from their dynamics and patterns. That was a more complicated description of the context.
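A minimal sketch of that “dimension plus attributes” framing, with made-up events for the department manager case:

```python
# Every logged occurrence is an event on one dimension (email, phone, location)
# with a free-form attribute bag; per-dimension time series fall out naturally.

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Event:
    dimension: str                 # "email", "phone", "location", ...
    ts: datetime
    attributes: dict = field(default_factory=dict)

stream = [
    Event("email", datetime(2015, 3, 2, 9, 15), {"peer": "CFO", "size_kb": 480, "attachments": 2}),
    Event("phone", datetime(2015, 3, 2, 11, 5), {"peer": "CFO", "duration_min": 22}),
    Event("location", datetime(2015, 3, 2, 14, 0), {"place": "CFO office", "duration_min": 45}),
]

emails = [e for e in stream if e.dimension == "email"]
print(len(emails), emails[0].attributes["peer"])
```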

 

Interpreting Context

Well, well, but how to interpret those data streams, how to interpret the context? What we have: multiple data streams. What we need: to identify the run-time context. So, the pipeline is straightforward.

First, we have to log the Data, from each dimension of interest. It could be done via software or hardware sensors. Software sensors are usually plugins, but could be more sophisticated, such as object recognition from surveillance cameras. Hardware sensors are GPS, Wi-Fi, turnstiles. There could be combinations, like a check-in somewhere. So, a lot could be done with software sensors. For the department manager case, it’s a plugin to Exchange Server or Outlook to listen to emails, a plugin to the ATS to listen to the phone calls and so on.

Second, it’s time for low-level analysis of the data. It’s Statistics, then Data Science. Brute force to establish what is credible or not, then looking for the emerging patterns. The bottleneck with Data Science is the human factor. Somebody has to look at the patterns to decrease false positives or false negatives. This step is more about discovery, probing and trying to prepare a foundation for the more intelligent next step. More or less everything is clear with this step. Businesses have already started to bring up their data science teams, but they still don’t have enough data for the science:)

Third, it’s Data Intelligence. As MS said some time ago, “Data Intelligence is creating the path from data to information to knowledge”. This should be described in more detail, to avoid ambiguity. From Technopedia: “Data intelligence is the analysis of various forms of data in such a way that it can be used by companies to expand their services or investments. Data intelligence can also refer to companies’ use of internal data to analyze their own operations or workforce to make better decisions in the future. Business performance, data mining, online analytics, and event processing are all types of data that companies gather and use for data intelligence purposes.” Some data models need to be designed, calibrated and used at this level. Those models should work almost in real-time.

Fourth is Business Intelligence. Probably the first step familiar to the reader:) But we look further here: past data and real-time data meet together. Past data is individual for the business entity. Real-time data is individual for the person. Of course there could be something in the middle. Go find a comparison between statistics, data science and business intelligence.

Fifth, finally, it is Analytics. Here we are within the individual context for the person. There should be a snapshot of ‘AS-IS’ and recommendations of ‘TO-DO’; if the individual wants, there should be reasoning on ‘WHY’ and ‘HOW’. I have described it in detail in previous posts. The final destination is the individual context. I’ve described it in the series of Advanced Analytics posts; here is the link for Part I.

Data Streams

Data streams come from data sources. The same source could produce multiple streams. Some ideas are below; the list is unordered. Remember that special Data Intelligence must be put on top of the data from those streams.

Indoor positioning via Wi-Fi hotspots, contributing to the mobile/mobility/motion data stream: where the person spent most time (at the working place, in meeting rooms, in the kitchen, in the smoking room), when the person changed location frequently, directions, durations, sequence etc.

Corporate communication via email, phone, chat, meeting rooms, peer to peer, source control, process tools, productivity tools. It all makes sense for analysis, e.g. because at the time of a release there should be no creation of new user stories. Or the volumes and frequency of check-ins to source control…

Biometric wearable gadgets like BodyMedia to log the intensity of mental (or physical) work. If there is low calorie burn during long bad meetings, that could be revealed. If there is not enough physical workload, then for the sake of better emotional productivity it could be suggested to take a walk.

 

Data Graphs from Data Streams

OK, but how to build something tangible from all those data streams? The relation between Data Graphs and Data Streams is many-to-many. Look, it is possible to build the Mobile Graph from very different data sources, such as face recognition from a camera, authentication at an access point, IP address, GPS, Wi-Fi, Bluetooth, check-ins, posts etc. Hence when designing the data streams for some graph, you should think about one-to-many relations: one graph can use multiple data streams from the corresponding data sources.

To bring more clarity into the relations between graphs and streams, here is another example: the Intention Graph. How could we build the Intention Graph? The intentions of somebody could be totally different in different contexts. Is it a week day or the weekend? Is the person static in the office or driving a car? Who are the peers that the person has communicated with a lot recently? What is the type of communication? What is the time of the day? What are the person’s interests? What were the previous intentions? As you see, there could be data logged from machines, devices, comms, people, profiles etc. As a result we will build the Intention Graph and will be able to predict or prescribe what to do next.
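As a toy illustration of folding several streams into one Intention Graph, here is a sketch using networkx; the evidence sources, weights and intention labels are invented for the example.

```python
# Minimal Intention Graph sketch: a person node, candidate intention nodes,
# and one edge per piece of evidence coming from a different data stream.

import networkx as nx

G = nx.MultiGraph()   # MultiGraph so each stream keeps its own edge
person = "dept_manager"

G.add_edge(person, "intent:budget_review", source="email", weight=0.6)
G.add_edge(person, "intent:budget_review", source="location", weight=0.8)
G.add_edge(person, "intent:release_planning", source="process_tool", weight=0.3)

def score(graph, who, intent):
    """Aggregate evidence from all streams pointing at one intention."""
    if not graph.has_edge(who, intent):
        return 0.0
    return sum(d["weight"] for d in graph[who][intent].values())

for intent in ("intent:budget_review", "intent:release_planning"):
    print(intent, score(G, person, intent))
# intent:budget_review scores highest -> the currently dominant intention
```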

 

Context from Data Graphs

Finally, having multiple data graphs, we could work on the individual context, the personal UX. Technically, it is hardly possible to deal with all those graphs easily. It’s not possible to overlay two graphs – it is called modality (as one PhD taught me). Hence you must split and work with a single modality. Select which graph is most important for your needs and use it as a skeleton. Convert relations from other graphs into attributes which you could apply to the primary graph. Build an intelligence model for the single-modality graph with plenty of attributes from the other graphs. Obtain personal/individual UX at the end.
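Here is a minimal sketch of that single-modality approach: a mobility graph as the skeleton, with the other graphs reduced to plain node attributes; all names and numbers are illustrative.

```python
# Pick one graph as the primary modality and fold the others in as attributes,
# instead of trying to overlay whole graphs.

import networkx as nx

# Primary modality: mobility (which places the person moves between)
mobility = nx.DiGraph()
mobility.add_edge("own_office", "cfo_office", trips_per_day=3)
mobility.add_edge("own_office", "meeting_room_2", trips_per_day=1)

# Secondary graphs are reduced to attributes on the primary graph's nodes
interest_graph = {"cfo_office": ["budgeting", "forecasting"]}
social_graph = {"cfo_office": {"peer": "CFO", "comm_intensity": 0.8}}

for node in mobility.nodes:
    mobility.nodes[node]["interests"] = interest_graph.get(node, [])
    mobility.nodes[node]["social"] = social_graph.get(node, {})

# One graph, one modality, enriched with everything else as plain attributes
print(mobility.nodes["cfo_office"])
```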


On the Internet of Everything

Five Waves of the “Internet of Things” on its Way of Transforming into “Internet of Everything”

http://united.softserveinc.com/blogs/software-engineering/may-2014/internet-of-things-transforming/


Advanced Analytics, Part V

This post is related to the details of visualization of information for executives and operational managers on the mobile front-end: what is descriptive, what is predictive, what is prescriptive, how it looks, and why. The scope of this post is the cap of the information pyramid. Even if I start with something detailed, I still remain at the very top, at the level of the most important information, without details on the underlying data. Previous posts contain the introduction (Part I) and the pathway (Part II) of the information to the user, especially executives.

Perception pipeline

The user’s perception pipeline is: RECOGNITION –> QUALIFICATION –> QUANTIFICATION –> OPTIMIZATION. During recognition the user just grasps the entire thing, starts to take it as a whole; ideally we should deliver personal experience, hence the information will be valuable but probably delivered slightly differently than in the previous context. More on personal experience is in the next chapter below. As soon as the user grasps/recognizes, she is capable of classifying or qualifying by commonality. The user operates with categories and scores within those categories. The scores are qualitative and very friendly for understanding, such as poor, so-so, good, great. Then the user is ready to reduce subjectivity and turn to numeric measurements/scoring. That’s quantification, converting good & great into numbers (absolute and relative). As soon as the user is all set with numeric measurements, she is capable of improving/optimizing the business or process or whatever the subject is.

Default screen

What should be rendered on the default screen? I bet it is a combination of the descriptive, predictive and prescriptive, with a large portion of space dedicated to the descriptive. Why is descriptive so important? Because until we build AI, the trust and confidence in those computer-generated suggestions is not at the required level. That’s why we have to show the ‘AS IS’ picture, to deliver how everything works and what happens without any decorations or translations. If we deliver such a snapshot of the business/process/unit/etc., the issue of trust between human and machine might be resolved. We used to believe that machines are pretty good at tracking tons of measurements, so let them track it and visualize it.

There must be an attempt from the machines to try to advise the human user. It could be done in the form of a personalized sentence, on the same screen, along with the descriptive analytics. So putting some TO-DOs there is absolutely OK, while believing that the user will trust them and follow them is naive. The user will definitely dig into the details of why such a prescription is proposed. It’s normal that the user is curious about the root-cause chain. Hence be ready to provide almost the same information with additional details on the reasons/roots, trends/predictions, classifications & pattern recognition within KPI control charts, and additional details on the prescriptions. If we visualize [on top of the inverted pyramid] with a text message and a stack of vital signs, then we have to prepare an additional screen to answer that list of mentioned details. We will still remain at the top of the pyramid.

default_screen

 

Next screen

If we got ‘AS IS’ then there must be ‘TO BE’, at least for the symmetry:) The user starts on the default screen (recognition and qualification) and continues to the next screen (qualification and quantification). The next screen should have more details. What kind of information would be logically relevant for the user who got the default screen and looks for more? Or, better to say, looks for ‘why’? Maybe it’s time to list them as bullets for more clarity:

  • dynamic pattern recognition (with a highlight on the corresponding chart or charts) of what is going on; it could be one of the seven performance signals or one of the three essential signals
  • highlight the area of the significant event [dynamic pattern/signal] on the other charts to juxtapose what is going on there, to foster thinking about potential relations; it’s still the human who thinks, while the machine assists
  • parameters & decorations for the same control charts, such as min/max/avg values, identification of the quarters or months or sprints or weeks or so
  • normal range (also applicable to the default screen) or even ranges, because they could be different for different quarters or years
  • trend line, using the most applicable method for approximation/prediction of future values, e.g. a release forecast (a minimal forecasting sketch follows this list)
  • its parts should be clickable for digging from relative values/charts into the absolute values/charts for even more detailed analysis; from qualitative to quantitative
  • your ideas here
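As promised next to the trend-line bullet, here is a minimal forecasting sketch: a least-squares linear trend fitted to a KPI series and extrapolated a few periods ahead; the series and horizon are made up.

```python
# Fit a simple linear trend to a KPI series and extrapolate a few sprints
# ahead, e.g. for a release burn-up forecast.

import numpy as np

kpi = np.array([12, 15, 14, 18, 21, 20, 24, 27])   # KPI value per sprint
t = np.arange(len(kpi))

slope, intercept = np.polyfit(t, kpi, deg=1)        # least-squares linear trend

horizon = 3
future_t = np.arange(len(kpi), len(kpi) + horizon)
forecast = slope * future_t + intercept
print(f"trend: {slope:.2f} per sprint, next {horizon} sprints: {np.round(forecast, 1)}")
```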

signal

Recognition of signals as dynamic patterns is identification of the roots/reasons for something. Predictions and prescriptions could be driven by those signals. Prescriptions could be generic, but it’s better to make personalized prescriptions. Explanations could be designed for the personal needs/perception/experience.

 

Personal experience

We consume information in various contexts. If it is the release of a project or product, then the context is different from the start of the zero sprint. If it’s a merger & acquisition, then the expected information is different from a quarterly review. It all depends on the user (from the CEO to CxOs to VPs to middle management to team management and leadership), on the activity, on the device (Moto 360 or iPhone or iPad or car or TV or laptop). It matters where the user is physically; location does matter. Empathy does matter. But how to reach it?

We could build users’ interests from social networks and from their interaction with other services. Interests are relatively static in time. It is also possible to figure out intentions. Intentions are dynamic and useful only when they are hot. Business intentions are observable from business comms. We could sense the intensity of communication between the user and the CFO and classify it as a context related to budgeting or a budget review. If we use sensors on the corporate mail system (or mail clients), combined with GPS or Wi-Fi location sensors/services, or with a manual check-in somewhere, we could figure out that the user indeed intensified comms with the CFO and they worked together face-to-face. Having such a dynamic context, we are capable of delivering the information in that context.
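A minimal sketch of that budgeting-context detection, with invented thresholds and data; a real implementation would sit on top of the mail and location sensors described above.

```python
# Combine a communication signal (share of recent messages exchanged with the
# CFO) with a co-location signal (time spent in the CFO's office) and flag a
# "budgeting" context when both are unusually high.

recent_messages = ["CFO", "CFO", "team_lead", "CFO", "HR", "CFO"]
locations_today = {"own_office": 4.5, "cfo_office": 2.0, "lunch": 0.0}  # hours

def detect_budgeting_context(messages, locations,
                             comm_threshold=0.5, colocation_hours=1.0):
    cfo_share = messages.count("CFO") / max(len(messages), 1)
    together = locations.get("cfo_office", 0.0)
    return cfo_share >= comm_threshold and together >= colocation_hours

if detect_budgeting_context(recent_messages, locations_today):
    print("context: budgeting -> deliver budget-focused analytics first")
else:
    print("context: default")
```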

The concept of personal experience (or personal UX) is similar to a braid (the hairstyle). Each graph of data helps to filter relevant data. Together those graphs allow locating the real-time context. Having such a personal context, we could build and deliver the most valuable information to the user. More details on how to handle the interest graph, intention graph, mobile graph and social graph, and on which sensors could bring the modern new data, are available in my older posts. So far I propose to personalize the text message for the default screen and the next screen, because it’s easier than vital signs, and it fits into wrist-sized gadgets like the Moto 360.


Advanced Analytics, Part IV

This post is the next one in the series on Advanced Analytics. Check out the previous posts: the introduction, ruminations on conveying information, and modern concepts of information and data for executives.

Dashboard or not Dashboard?

There is nothing wrong with a dashboard, except that it puts a stereotype on your consciousness and that defines your further expectations. In general dashboards are good; check out this definition from Wikipedia: “An easy to read, often single page, real-time user interface, showing a graphical presentation of the current status (snapshot) and historical trends of an organization’s key performance indicators (KPIs) to enable instantaneous and informed decisions to be made at a glance.”

Easy to read sounds exciting – at last executive or operating information will be friendly and easy to read! Single page resonates with my description of A4 or letter size [What to communicate?] due to the anthropological sizes of our bodies and body parts, and the context of information consumption, usability. The real-time user interface could be even improved with real-time or near-real-time data delivery to the user. Visualization as a graphical representation of the current status – a snapshot – and the dynamics/trend of the indicator resonates with ‘All Data’. Enablement of instantaneous and informed decisions resonates with Vital Signs.

So the conclusion is to dashboard. The open question is how to dashboard. Is there a standard for dashboards? What are the best practices? What are the known pitfalls? What will modern dashboards look like? Let's start with the known problems, so that we know what to overcome to make dashboarding more usable and valuable.

Gauges suck!

Dashboard gauges and dials do suck. They are the wrong way to visualize information. What is a gauge? From Wikipedia: “In engineering, a gauge is a device used to make measurements or in order to display certain information, like time.” The primary task of a gauge is to perform a measurement; the secondary task is to display it. Furthermore, a gauge must use a specific measurement principle, because the same thing could be measured via multiple different principles (e.g. temperature could be measured with a mercury thermometer, infrared radiation, a resistance thermometer and in other ways). What do “dashboard gauges” do? They do not measure anything at all, while they display almost everything. That is the root of the problem. To be specific, the problem lies in the ill application of the principle of analogy [skeuomorphism].

What are other problems of the dials/gauges?

  • They “say little and do so poorly”. By Stephen Few.
  • They might look cute, but like empty calories they give little information for the amount of space they consume. By Aeternus Consulting.
  • Retro design is crippling innovation. By Wired. Skeuomorphs aren’t always bad; the Kindle is easy to use precisely because it behaves so much like a traditional print book.
  • Do you know how much research went into determining that idiot lights and gauges that look just like those in our cars are the best way to display information on a dashboard for monitoring your organization’s performance? The answer is zilch; none whatsoever. Back in the beginning, when we started calling computer-based monitoring displays dashboards, someone had the bright idea of making display widgets that looked like those in cars. This is an example of taking a metaphor too literally. By Stephen Few, Perceptual Edge. Hence don’t settle for the illusion of control instead of real control.
  • And there are several more arguments for why dashboard dials and gauges are useless for KPIs. I will devote the entire next section to those details. Keep reading.

Gauges are bad for KPIs

This section extends and continues the previous one, with more attention to the visualization of KPIs. What are KPIs? From Wikipedia: “A key performance indicator (aka KPI) is a type of performance measurement. An organization may use KPIs to evaluate its success, or to evaluate the success of a particular activity in which it is engaged. Sometimes success is defined in terms of making progress toward strategic goals, but often success is simply the repeated, periodic achievement of some level of operational goal (e.g. zero defects, 10/10 customer satisfaction, etc.)” Just read that aloud and listen to the words: activity, performance, progress, repeated, periodic. All those words imply duration in time. But what do we have on the gauge? Nothing. The gauge clips and ignores everything except the current value. That is poor. This and other problems are listed below; they are partially reused for your convenience from the Stacey Barr blog:

  • The purpose of performance measures is to track change toward a target through time. Performance doesn’t improve immediately – you need to allow time to change your processes so they become capable of operating at that targeted level. Performance measurement involves monitoring change over time, and looking for signals about whether it’s moving close enough and fast enough toward the target.
  • Dials and gauges don’t show change over time at all. You are flying blind. You need this [dynamic] context in your performance measures to help you prioritize. Because dials and gauges don’t use this context, they are also incapable of showing you true signals in your measures.
  • Dials and gauges highlight false signals and have you knee-jerk reacting to routine variation. Check out the Stacey Barr post on routine variation and other stats tips for KPIs; a minimal sketch of separating routine variation from real signals follows this list.
  • There is a better way to show performance measures on dashboards than dials or gauges: we can provide historical context and valid rules for signals of change. Check out smartlines. You will be surprised to see the names of Tufte and Few, and sparklines, there. Besides that there are other ideas.
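Here is a minimal sketch of what such historical context buys you, using XmR-style natural process limits (mean ± 2.66 × the average moving range, the usual construction behind control charts); the KPI history is made up.

```python
def natural_process_limits(values):
    """XmR-style limits: points outside are likely real signals,
    points inside are routine variation a gauge would over-react to."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    lower, upper = mean - 2.66 * avg_mr, mean + 2.66 * avg_mr
    signals = [(i, v) for i, v in enumerate(values) if v < lower or v > upper]
    return lower, upper, signals

# Ten periods of a KPI: mostly routine noise, one genuine shift at the end.
kpi_history = [52, 49, 51, 50, 53, 48, 50, 51, 49, 62]
print(natural_process_limits(kpi_history))   # only the last point is flagged
```

A gauge would show only the last number, 62, with no way to tell whether it is a real signal or just noise.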

Criteria for KPI visualization

Here is a list of criteria for proper tracking, analysis and visualization of KPIs. Having understood them, it becomes obvious why gauges and dials should be archived as a weak use of skeuomorphism. A proper approach would be capable of both detecting these situations and representing them in the UI (a rough detection sketch follows below):

  • Chaos in performance. First as deviation from predictability, then true chaos.
  • Worsening performance. E.g. degrading productivity or quality or value.
  • Flat plateau. Everything is stable and not changing, while change towards growing revenue or growing happiness is expected.
  • Wrong pace. We are improving but not fast enough. The target remains out of reach.
  • Right pace. We will reach the strategic target in time.
  • We are there. We have reached the target already.
  • We exceeded expectations. The target is exceeded.

Check out “The 7 Performance Signals to Look For in Your KPIs” by Stacey Barr for more comments and details.
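A rough sketch of how these signals could be detected for a "bigger is better" KPI follows; the simple linear trend, the tolerance band and the thresholds are my assumptions, not Stacey Barr's method, and chaos detection is better served by the control limits sketched above.

```python
def classify_signal(values, target, periods_to_deadline, tolerance=0.02):
    """Map a KPI history to one of the signals for a 'bigger is better'
    measure. Chaos (loss of predictability) is omitted here; it is better
    detected with natural process limits."""
    current = values[-1]
    slope = (values[-1] - values[0]) / (len(values) - 1)  # crude linear trend
    band = tolerance * abs(target)
    if current > target + band:
        return "exceeded expectations"
    if abs(current - target) <= band:
        return "we are there"
    if slope < 0 and current < values[0] - band:
        return "worsening performance"
    if abs(slope) <= band / len(values):
        return "flat plateau"
    projected = current + slope * periods_to_deadline
    return "right pace" if projected >= target else "wrong pace"

print(classify_signal([40, 42, 44, 46], target=60, periods_to_deadline=10))  # right pace
```

Whatever the exact rules, they all need the history of the measure; a dial holding a single value has nothing to classify.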

Conclusion

The concept of the dashboard is up to date, powerful and suitable for modernization; the previous posts confirm that with the majority of the arguments. But the use of dials or gauges is not the right design solution for visualization on a dashboard. Line charts, control charts, ‘All Data’ charts, smartlines, sparklines, logarithmic charts and other graphical representations are elegant and powerful enough to conform to the seven criteria for KPI visualization. At the same time they conform to high-level, executive-friendly information (see the section What exactly to communicate).


Advanced Analytics, Part III

This post is also about the front-end part, as a conduit for information delivery to the decision maker. The previous two posts are available; it is recommended to check out the Introduction into the Big Picture and the Ruminations on Conveying, Organization and Segmentation of Information for Executives as users.

Big Data? All Data!

It’s time to pay attention to all the data available. Personally I see no reason to limit ourselves to big data. All data matters; the most recent data matters more, the oldest data matters less. It is possible to visualize plenty of data in a relatively small space, which is convenient for delivery onto smartphones and wrist-sized gadgets. The rationale is to depict firm details for the most recent/relevant data, where relevancy is determined by the adopted processes. In SDLC it could be a sprint or iteration; in healthcare it could be the period since the current admission. The latest measured value matters a lot, hence it must be clearly distinguished on top of the other values within the period. The dynamics during the period also matter, hence they should be visualized as well.

Previous periods/cycles do matter, especially for comparison and benchmarking to enable better strategic planning. Firm details of the dynamics during past cycles are not so valuable, while deviations in both positive and negative directions are very informative. The decision maker knows how to classify the current cycle's exceptions: whether something brand new has happened, or whether the business experienced even more severe deviations in the past, and can recall how those were handled.

Inspired some time ago by the medical patient summaries by Tufte and Powsner, I have tried to generalize the concept so it is applicable to other, non-healthcare industries. So far it fits perfectly and allows customization and flexibility, especially for process optimization, where people usually put control charts on dashboards. Below is a generalized version of the ‘All Data’ chart as a concept.

all_data
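A minimal matplotlib sketch of the same layout, assuming synthetic data: the panel widths grow to the right, the normal range is a quiet band, the latest value is a dot and the min/max exceptions are marked in every panel.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic measurements: long history, previous cycles, current cycle.
rng = np.random.default_rng(1)
history = 50 + rng.normal(0, 5, 500)
previous = 50 + rng.normal(0, 4, 100)
current = 52 + rng.normal(0, 2, 10)
normal_range = (45, 55)                        # assumed "normal" band

fig, axes = plt.subplots(1, 3, figsize=(10, 2), sharey=True,
                         gridspec_kw={"width_ratios": [1, 2, 6]})

panels = [("all data", history), ("previous cycles", previous), ("current cycle", current)]
for ax, (title, data) in zip(axes, panels):
    ax.axhspan(*normal_range, color="0.92")    # normal range as a quiet band
    ax.plot(data, lw=0.8, color="0.3")
    ax.plot(int(np.argmax(data)), data.max(), "^", ms=4, color="red")   # max exception
    ax.plot(int(np.argmin(data)), data.min(), "v", ms=4, color="red")   # min exception
    ax.set_title(title, fontsize=8)
    ax.set_xticks([])

axes[2].plot(len(current) - 1, current[-1], "o", ms=5, color="black")   # latest value
plt.tight_layout()
plt.show()
```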

Inverted Pyramid

The principle of the inverted pyramid is partially present there, with the pyramid rotated by 90 degrees. The most important information is within the biggest part of the chart, in the center and on the right. It is information rather than data, because it conveys the latest value, the dynamics during the recent cycle, benchmarking against the normal range, and an indication of deviations (in a qualitative way, using only two categories: somewhat and significant). It is rational to stay within roughly ten measurements, so that they are relatively easy to remember.

The next, narrower part to the left of the sparkline is partially information and partially data. It is used for comparison and benchmarking, analysis of exceptions and retrospective analysis. It is logical to fit roughly ten times more data there, so that if the biggest part lacks information, the user is able to dig deeper and obtain significantly more facts and reasons, as measurements of the same thing. Hence a phase shift means at least 10x growth. With the medical patient summaries the ratio was similar: one or two months between admission and discharge vs. one previous year. But 10x is not a hard ratio; it is more an indication that we need a kind of phase shift to different data, at a different level of abstraction.
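A minimal sketch of these phase shifts with pandas, assuming one raw daily measurement; the windows (30 days, one year, everything) and the resampling steps are illustrative, not a fixed ratio.

```python
import numpy as np
import pandas as pd

# Three years of a daily measurement, as a stand-in for any process metric.
days = pd.date_range("2013-01-01", "2015-12-31", freq="D")
raw = pd.Series(np.random.default_rng(0).normal(50, 5, len(days)), index=days)

current_cycle = raw.iloc[-30:]                            # right part: every point
previous_cycles = raw.iloc[-365:].resample("W").mean()    # middle part: ~10x coarser
all_data = raw.resample("QS").mean()                      # left part: coarser again

# Only the min/max exceptions stay comparable across all three parts.
extremes = {name: (round(s.min(), 1), round(s.max(), 1))
            for name, s in [("current", current_cycle),
                            ("previous", previous_cycles),
                            ("all", all_data)]}
```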

The leftmost narrow part is actually all of the data, including the oldest. It is an additional phase shift relative to the middle part, so imagine another 10x increase and digging to a different level of abstraction again. Only the exceptions marked as min/max are comparable across all parts. Everything else constitutes the inverted pyramid of making information out of raw data.

Cap of the pyramid: Vital Signs

I think the cap of the information pyramid requires special conceptualization. ‘All Data’ is an attractive tool to deliver project/process vital signs to executives and other managers (decision makers), and those vital signs could be compressed even more. Furthermore, the top five to seven measurements could be stacked and consumed all together. That increases the value of the information synergistically, because some indicators are naturally perceived together, as a juxtaposition of what is going on.

Specific vital signs for business performance and SDLC process optimization were listed in detail in my previous post Advanced Analytics, Part II. Here I will only mention them for your convenience: productivity, predictability, value and value-add, innovation in core competency, human factor and emotional intelligence/engagement. Those are the must-haves for executives. They could be stacked as vital signs and consumed as an integral big picture.

vistal_signs

Of course we can introduce a normal range there, ticks for time tracking, highlighting of min/max… The drawing represents the idea of stacking and consuming executive information on SDLC project/process performance in a modern manner. You could criticize or improve it; I'll be thankful for feedback.
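A minimal sketch of that stacking with matplotlib, assuming six vital-sign series generated at random; in a real dashboard each panel would be an ‘All Data’ strip fed by the corresponding measurement.

```python
import numpy as np
import matplotlib.pyplot as plt

signs = ["productivity", "predictability", "value & value-add",
         "innovation", "human factor", "engagement"]
rng = np.random.default_rng(7)
weeks = np.arange(26)

fig, axes = plt.subplots(len(signs), 1, figsize=(6, 5), sharex=True)
for ax, name in zip(axes, signs):
    series = 50 + np.cumsum(rng.normal(0, 1, len(weeks)))    # placeholder trend
    ax.plot(weeks, series, lw=1, color="0.3")
    ax.plot(weeks[-1], series[-1], "o", ms=4, color="black")  # latest value stands out
    ax.set_ylabel(name, rotation=0, ha="right", fontsize=7)
    ax.set_yticks([])

axes[-1].set_xlabel("week")
plt.tight_layout()
plt.show()
```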

There are about two dozen lower-level operational indicators and measurements. Some of them could be naturally conveyed via the ‘All Data’ concept; others require different concepts. I am going to address them in the next posts. Stay tuned.
