Quantum Hello World

What?

So What?

10,000 times more productive than Xeons at the same power consumption level, with the potential to reach 50,000x.

Explanation on Cats

Schrödinger’s cat lives. Quantum is not micro anymore, it is macro, it is everything.

 


We will fuck robots

We already love machines

If you have a car, and you don't use any name to talk to your car, then there are two reasons: either your car is just another soulless car, or you are special. Usually every car owner loves her/his car and uses some name for it. If your car is under-powered, you would ask her: "come on, come on car_name", when climbing an ascending road. So you are already talking to the machines, at least to one machine, your car.
The same is true for boats, a small airboat or a bigger luxurious one. Obviously it is true for bikes. Maybe now it is emerging for flying drones. It is interesting what the employees who still work at warehouses call the machines there… Probably they don't love them. But the fact is that we love our lifestyle machines: cars, bikes, boats, drones.

Exponential technologies

There is technological acceleration nowadays. It is most visible in eight technologies (check Abundance and Singularity University for deeper details):
* biotech and bioinformatics
* computational systems
* networks and sensors
* artificial intelligence
* robotics
* digital manufacturing
* medicine
* nanotech and nanomaterials

It is very interesting to dig into each of them, or to combine them. But that is not the purpose of this post. There are two more, or just the Big Two, as an umbrella over those eight: military and porn. Much research is applied in the military first, then goes to other industries and lifestyle. Many technological problems are solved in porn, and many opportunities are created there.

Connecting the dots

Materials are needed to reach a realistic experience, to transcend from dolls to human peers. A new Turing test: you smell and touch the skin and hair, and you cannot distinguish between natural/organic and artificial. Maybe we could program existing cells to grow and behave slightly differently, or we will invent new synthetic materials that will be indistinguishable from organic ones. Or hybrids, why not? Probably new materials will self-assemble or be manufactured at the atomic level by new 3D printers, to connect the right atoms into the right atomic grids. A lot depends on the connection order: diamonds and ashes are made from the same carbon atoms. The bottom line is that biotech, nanomaterials and 3D printing are all empowering the creation of cyberskin with a realistic experience.

Computational systems, robotics and artificial intelligence are what's behind the scenes [read: behind the skin]. Having joints is not enough. All those artificial bones and ligaments must be orchestrated perfectly. It is all about real-time computing. Energy-efficient real-time computing, to avoid any wires sticking out, or plenty of heat or noise being released. Everything should be smart enough to be packed into a known volume, to have a known weight, center of mass and temperature, and to bend and move realistically. Emotional intelligence is important too, to have an adequate mental reaction between human and machine.

Rudimentary evidence

Sex dolls, dildos and travel pussies have existed for years. Porn stars have official copies of their realistic genitals and dolls on sale. Porn stars might be OK with somebody fucking their rubber shadows, which cannot be said of other celebrities, because of privacy, ethics and morals. There is some real knowledge of beauty: female face research, published in Brain Research, measured as a hypothalamus reaction. Something similar should be available for male faces. And not only faces, but also the body, the voice, the smell, manners, etc. All that could be grasped by measurements and machine learning. In the end we need some classification, like those beautiful faces, to know what exists, and then figure out how to use it.

face_research_WP.jpg

 

Breakthrough

There will be just better sex dolls, indistinguishable from people. The Turing sex test will be passed between the legs. Caleb could try it with Ava, and he wanted to; he truly fell in love with Ava [Ex Machina]. That's just a question of time. What is interesting is how we are going to control/prevent the emergence of copies of ourselves. OK, maybe there is no big demand for copies of you, but there will be big demand for copies of celebrities. And celebrities may not be happy that somebody fucks their realistic clones. That will go underground and will grow beyond the law. Probably there will be countries or territories worldwide capitalizing on these sex havens, like some did as tax havens. We will have sexual intercourse with robots like Deckard had with Rachael [Blade Runner, Los Angeles 2019], full of emotions and feelings on both sides. And enough people will do sex tourism for the forbidden fruit, to fuck their favorite peers and celebrities. Maybe there will be on-demand manufacturing of a sex clone of anybody via pictures & videos from their social traces… This is how the ultimate experience in porn will evolve. People will fuck their lovely robots.


On Go

Some time ago I noticed somebody was building a #golang complaints list. On Monday I noticed this tweet by @golangweekly. Now I have decided to address the top five complaints, taken from How to complain about Go. So here are those top five:

  1. error handling / no exceptions
  2. no generics
  3. stuck in 70’s
  4. no OOP
  5. too opinionated

Below they are addressed in the numbered sections, one complaint per section.

1. error handling / no exceptions

The majority of programmers are lazy. Especially programmers of business logic. Commodity programmers constitute 90-95% of the entire programmer base. The remaining 5-10% are master programmers. Usually commodity programmers write simple business logic, translating requirements into a machine program, nothing mission-critical, usually not challenging, and they get lazy. Some commodity programmers become very lazy. It is normal for them to code only the positive path and totally ignore the negative one. That is not good in any programming language. It is just that some languages tolerate the laziness and allow explicit coding of the errors to be avoided.

Go does not tolerate laziness. If you wrote "if", please write "else". Within "else" write retry logic, or switch to default parameters, or try to recover, or stop explicitly. If something can fail, then check for success or failure before moving forward. In Go you could still skip the negative path, but you are forced to code the positive one explicitly.
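A minimal sketch of that advice in Go. The config file name, its format and the fallback value here are all hypothetical, just to show the shape of an "if" with a meaningful "else":

package main

import (
	"errors"
	"fmt"
	"os"
)

// loadTimeout reads a timeout setting from a file; the file name and
// format are made up for this example.
func loadTimeout(path string) (int, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return 0, err
	}
	var t int
	if _, err := fmt.Sscanf(string(data), "%d", &t); err != nil {
		return 0, errors.New("timeout file is malformed")
	}
	return t, nil
}

func main() {
	timeout, err := loadTimeout("timeout.conf")
	if err != nil {
		// The "else" path: fall back to a default instead of silently ignoring the error.
		fmt.Println("using default timeout, reason:", err)
		timeout = 30
	}
	fmt.Println("timeout:", timeout)
}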

Errors are part of the code. Because of the binary nature of information, errors are part of the flow. So it's better to program both sides, good & bad, light & dark. It's absolutely good that Go nurtures coding errors as organic pieces of your entire program. Isn't "if err != nil" annoying? Yes, it is. It is part of the technology; you just do it to use the technology, you adapt to the technology. Skipping errors, or obscuring errors, is a bigger evil than some idiomatic annoyance.

Exceptions failed [in other programming languages]. Exceptions were invented to handle really exceptional situations. The majority of C++ coders never read "The Design and Evolution of C++" to understand why language features were introduced to solve real problems with the C++ language. So exceptions failed in C++ during its best times. Java and .NET coders moved even further away from exceptions as really exceptional situations, and exceptions became error codes and events. Having exceptions as events is the worst case, because they were not designed for that purpose. The laziness of commodity programmers reached its pinnacle with catch(…){} and catch (Exception e){} and bare except: and so on. That sucks.

As Go is not a programming language for all purposes (it was designed to rewrite old C++ programs and to do new system programming), there is no need to listen to those complaints by commodity programmers. It is important not to nurture laziness. It is good to enforce checking for errors, and coding of both "if" and "else". There is no other way to go; the machine will not create the missing logic for the negative path for you. That code must be written by the programmer.

2. no generics

How did generics appear in other programming languages? Let's take C++ again. Before generics, coders used preprocessor directives to tweak and glue the words and generate the program code, so that the same algorithm/flow could work with different types. That gluing was done only for families of types, e.g. gluing for all numeric types (small integers, signed integers, unsigned integers, long integers, …, the same for floats, etc.). There is common sense in that, because the numeric types all together form one cluster vs. other clusters of types. But how did it unfold? You could parameterize everything by everything, even declaring a type on the left and passing it as a parameter on the right: class MainWnd : public CWindowImpl<MainWnd>

Typelists could be considered the pinnacle of generic design and generic programming. Here is the source code, typelist.h, allowing deep nesting of templates. AFAIR the initial implementation allowed 50 levels of nesting. Then the wisdom emerged: typelists' author Andrei Alexandrescu slams generic programming nowadays.

So it's OK that the generic paradigm was fashionable and promising in the past. It's also OK that it did not solve the problems. It's better to apply commonality and variability analysis during your abstraction exercises and use interfaces for coding the commonality, as in the sketch below. Go goes that way, which is good. Go lacks clustering by type families; that's not so bad, just think again about which type you really need and use only that type. It is not a problem of Go, it is your problem [laziness] during program design.
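To make the commonality-via-interfaces point concrete, here is a minimal sketch in Go; the Shape/Circle/Square types are invented for illustration, not taken from any real codebase:

package main

import "fmt"

// The commonality is captured as an interface: anything that can report its area.
type Shape interface {
	Area() float64
}

type Circle struct{ R float64 }
type Square struct{ Side float64 }

func (c Circle) Area() float64 { return 3.14159 * c.R * c.R }
func (s Square) Area() float64 { return s.Side * s.Side }

// One function works for every Shape: no generics, no preprocessor gluing.
func TotalArea(shapes []Shape) float64 {
	total := 0.0
	for _, s := range shapes {
		total += s.Area()
	}
	return total
}

func main() {
	fmt.Println(TotalArea([]Shape{Circle{R: 1}, Square{Side: 2}}))
}

The variability (how each shape computes its area) stays in the concrete types; the commonality (that an area exists) is all the algorithm needs to know.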

3. stuck in 70s

Java was released in 1995, far after the 70s. And we have the idiomatic (or idiotic?) public static void main. Is that an improvement since the 70s? Furthermore, each program is a class. OMG, my single script is a class? My simple dumb print is a class? Was that the escape from the 70s? I prefer to stay with good old ANSI C simplicity and clarity, rather than consuming such an escape from "being stuck".

For Java lovers… Java is ideal for code generation. If Java is written by machines, it is perfect. Usually that is what took place. Enterprise suites with visual BPM do generate Java code. Business analysts drag'n'drop, modeling the real-life process, with all the tiny details of type mapping, events, conditions, data transformations, transactions, etc. And the suite produces the Java code and runs it on middleware (usually an ESB). For machines all that formality and strictness with public static void main is OK. But not for humans.

Why did we get so many Java programmers then? Why didn't machines produce all the Java code? Two reasons. First: not many solutions were designed in BPM, or at least by making some domain-specific language (DSL) and designing the solution with it. So coders took the wrong tools and produced imperfect solutions. Second: not everything is possible in WYSIWYG tools. Some manual polishing is needed, and humans filled the gap left by machines. As machines generated Java code, humans had to complement it with the same Java, to fill in the blanks. Now some of that Java legacy has spread to Android…

OK, back to Go. Go was designed mainly to overcome C++ problems, for modern system programming. What is modern system programming? It's Docker, for example, not to mention internal Google stuff. C++ was based on C, and C is rooted in the 70s. So everything is OK with the similarity. I do not agree that we are stuck, and I do not agree that the 70s were bad at all. We had Concorde, introduced in 1976. We were accelerating, while today there is a feeling of deceleration. Go dropped modern over-engineered pseudo-features, and that is not going into the past, it is going into the future.

4. no OOP

This is the most pleasant complaint. Commodity programmers usually don't know OOD and OOP. Hundreds of times I've asked the same question in interviews, and hundreds of times they failed. The question was simple and two-part. First: which programming design paradigms do you know? Usually OOP was mentioned. Second: what is OOP, what are its main principles? Almost everybody answered "encapsulation, inheritance and polymorphism". After that the interview finished shortly. Few answered "abstraction" and then those other three.

So, abstraction. Abstraction is the foundation of any programming paradigm, especially of OOD/OOP: how you abstract the real world into the machine world, so that it can be emulated and run by the machine. Master programmers know that sometimes it's better to abstract into a function, or families of functions; sometimes into data; sometimes into compilation modules, files, namespaces, structures. It is context-dependent.

What did we get with OOP in the programming languages loved by the complainers? Abstraction into classes prevails, though other abstraction mechanisms are supported. And with abstraction into classes we have one God aka Object. Does it mean that the entire abstraction of the real-life case that we want to emulate and run on the machine is purely hierarchical? That is not a relevant mapping, because the real world is not hierarchical.

There are hierarchies, at some abstraction levels. But if you look one level up, or one level down, your hierarchy will be gone. You will see a mesh, a grid, a web. Then step one level again, and you will spot a hierarchy again. Go further, and you will get the mesh. Think of the orders of magnitude in both directions, from this to bigger, and from this to smaller. It's all about abstraction levels. In a software solution it's normal that multiple abstraction levels are present, corresponding to the reality, but it's not normal that only one method is used to deal with them all. OOD is OK only for the context where you deal with a true, realistic hierarchy. Outside of it, OOD is not the best choice.

Regarding abstraction, the Go language is absolutely normal. You are not imprisoned in the Object pseudo-hierarchy. You can abstract into multiple relevant language tools, depending on the context, as in the sketch below. I don't even want to move on to the secondary goodies (encapsulation & Co, they are OK there too). It's important that the primary one – abstraction – has been fixed.
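A small sketch of what "abstract into multiple relevant language tools" can look like in Go; the temperature-reading domain here is hypothetical:

package main

import "fmt"

// 1. Abstraction into a plain function.
func CelsiusToFahrenheit(c float64) float64 { return c*9/5 + 32 }

// 2. Abstraction into data: a struct holding a reading.
type Reading struct {
	SensorID string
	Celsius  float64
}

// 3. Abstraction into behavior: an interface for anything that produces readings.
type Source interface {
	Read() (Reading, error)
}

// A fake source standing in for a real sensor.
type FixedSource struct{ Value float64 }

func (f FixedSource) Read() (Reading, error) {
	return Reading{SensorID: "fixed", Celsius: f.Value}, nil
}

func main() {
	var src Source = FixedSource{Value: 21.5}
	r, err := src.Read()
	if err != nil {
		fmt.Println("read failed:", err)
		return
	}
	fmt.Printf("%s: %.1f C = %.1f F\n", r.SensorID, r.Celsius, CelsiusToFahrenheit(r.Celsius))
}

No God Object anywhere; each piece is abstracted with the tool that fits it.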

For technology creation Go is good. Just don't code business logic in Go. For biz logic please design (or select) a DSL/DSEL and build the logic at a higher abstraction level, 3.5/4GL, visually, or as a hybrid. A good book on how to think wider is "Multi-Paradigm Design for C++"; it is not tightly bound to C++ and is very useful for other programming tools.

5. too opinionated

So what? Is that bad? Opinion is the skeleton of design, the philosophy of a lifestyle. If somebody (Rob Pike?) invented the tool/technology to solve C++ problems, then it is cool. If you don't like some idioms, like the ok idiom for maps (see the snippet below), then think about every technology's pros and cons. The first spaceship could fly into space, but there was no restroom in it. The technology could not provide it. So if you needed to fly more than pee, you selected the technology and flew. If you needed to pee, you went with another technology and did not fly. It's just common sense.
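For reference, the ok idiom being complained about is just this: the second return value from a map lookup tells you whether the key was there at all (the map contents are made up for this snippet):

package main

import "fmt"

func main() {
	limits := map[string]int{"requests": 100}

	// The "comma ok" idiom: ok reports whether the key exists,
	// so a missing key is not confused with a stored zero value.
	if limit, ok := limits["requests"]; ok {
		fmt.Println("limit:", limit)
	} else {
		fmt.Println("no limit configured")
	}
}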

I suggest many of you read the very thin, useful book "The Practice of Programming". As soon as you understand that different tools were designed to solve different problems, and that multiple tools are used to solve bigger problems, you will give up religious fanaticism and stop averaging the tools. Each tool must remain as specific and laser-focused on a certain problem as possible. Just don't be lazy; master many tools, physically and mentally. Consider the future evolution of the tools, like the restroom in the spaceship. And stop blaming Go; use it when you need it, not when you want it. That's my opinion.

Smartphone Addiction

The addiction to the smartphone screen is really strong. Hopefully it will start to change in 2016 with Invisible Interfaces aka Natural Interfaces.

  

Photo is mine. Copyright (c) Vasyl Mylko. 

Internet of Things @ Stockholm, Copenhagen

This is an IoT story combined from what was delivered during Q1’15 in Stockholm, Copenhagen and Bad Homburg (Frankfurt).

We are stuck in the past

When I first heard Peter Thiel speak about our technological deceleration, it collided in my head with the technological acceleration described by Ray Kurzweil. It seems that both gentlemen are right and we [humanity] follow a multi-dimensional spiral pathway. Kurzweil reveals shorter intervals between spiral cycles. Thiel reveals that we are moving in a negative direction within the single current spiral cycle. Let's drill down into the current cycle.

Cars are getting more and more powerful (in terms of horsepower), but we don't move faster with cars. Instead we move slower, because of so many traffic lights, speed limits, traffic jams, gridlocks. It is definitely not cool to stare at a red light and wait. It is not cool either to brake because your green light ended. In Copenhagen the majority of people use bikes. It means they move at a speed of 20 kph or so… way slower than our modern cars would have allowed. Ain't it strange?

Aircraft are faster than cars, but air travel is slow too. We have strange connections. A trip from Copenhagen to Stockholm can take one full day because you have to fly Copenhagen-Frankfurt, wait, and then fly Frankfurt-Stockholm. That's how airlines work and suck money from your pocket for each air mile. Now add long security lines, weather precautions and weather cancellations of flights. Add union strikes. Dream about the decommissioned Concorde… already 12 years ago.

Smartphone computing power equals Apollo mission levels, so what? The smartphone is mainly used to browse people and play games. At some moment we will start using it as a hub, to connect tens of devices, to process tons of data before submitting them to the Cloud (because the data will soon not fit into the Cloud). But for now we under-use smartphones. I am sick of charging every day. I am sick of all those wires and adapters. That's ridiculous.

Cancer, Alzheimer's and HIV are still not defeated. And there is no optimistic mid-term forecast yet.

We [thinking people] admit that we are stuck in the past. We admit our old tools are not capable of bringing us into the future. We admit that we need to build new tools to break into the future. The Internet of Things is such a macro trend: building the new tools that will break us through into the future.

Where exactly are we stuck?

We are within the 3rd wave of IoT, called Identification, and at the beginning of the 4th wave of IoT, called Miniaturization. The two slightly overlap.

Miniaturization is evidence of Moore's Law still working. Pretty small devices are capable of running the same calculations as not-so-old desktops. Connecting industrial machinery via a small man-in-the-middle device is on the rise. It is known as Machine-to-Machine (M2M). There are two common scenarios here: the wire protocol, i.e. break into the dumb machine's wires and hook on there for readings and control; and the optical protocol, i.e. read from analog or digital screens and do optical recognition of the information.

A few more words about the optical protocol in M2M. Imagine you are running a biotech lab. You have good old centrifuges, doing their layering job perfectly. But they are not connected, so you need to read from the display and push the buttons manually. You don't want to break into the well-working machines, so you decide to read from their screens or panels optically, doing optical identification of the useful information. The centrifuges become connected. M2M without wires.
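A hedged sketch of the software side of that optical M2M loop; everything here (the readPanel OCR step, the publish target, the values) is hypothetical:

package main

import (
	"fmt"
	"time"
)

// PanelReading is what we hope to extract optically from the centrifuge panel.
type PanelReading struct {
	RPM  int
	Done bool
}

// readPanel stands in for a camera capture plus OCR step; here it is faked.
func readPanel() (PanelReading, error) {
	return PanelReading{RPM: 4200, Done: false}, nil
}

// publish stands in for sending the reading to a gateway or the cloud.
func publish(r PanelReading) { fmt.Printf("centrifuge: %d rpm, done=%v\n", r.RPM, r.Done) }

func main() {
	for i := 0; i < 3; i++ { // a real device would loop forever
		r, err := readPanel()
		if err != nil {
			fmt.Println("optical read failed, will retry:", err)
		} else {
			publish(r)
		}
		time.Sleep(1 * time.Second)
	}
}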

Identification is also on the rise in manufacturing. Just put in a small device to identify something like vibration, smoke, volume, motion, proximity, temperature, etc. Just attach a small device with the right sensors to a dumb machine and identify/measure what you are interested in. Identification is on the rise in lifestyle too. It is the wearables we put onto ourselves to measure various aspects of our activity. Have you ever wondered how many methods exist to measure temperature? Probably more than 10. Your (or my?) favorite wearables usually have thermistors (BodyMedia) and IR sensors (Scanadu).

Optical identification, as a powerful field within the entire Internet of Things Identification wave, requires a special section. Continue reading.

Optical identification

Why is optical so important? Guess: at what bandwidth do our eyes transmit data into the brain?
It is definitely more than 1 megabit per second and maybe (maybe not) slightly less than 10 megabits per second. For you geeks, that is not-so-old Ethernet speed. With 100 video sensors we end up with 1+ terabyte during business hours (~10 hours). That's a hell of a lot of data. It's better to extract useful information out of those data streams and continue with information rather than with data. Volume reduction could be 1000x and much more if we deal with relevant information only. Real-time identification is vital for self-driving cars.
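A quick back-of-the-envelope check of that number, assuming a per-sensor rate somewhere in that 1 to 10 megabit range:

package main

import "fmt"

func main() {
	const (
		sensors      = 100
		mbitPerSec   = 2.5     // assumed per-sensor rate, between 1 and 10 Mbit/s
		hours        = 10.0    // "business hours"
		bytesPerMbit = 1e6 / 8 // 125,000 bytes in a megabit
	)
	totalBytes := sensors * mbitPerSec * bytesPerMbit * hours * 3600
	fmt.Printf("~%.2f TB per day\n", totalBytes/1e12) // ~1.13 TB, i.e. "1+ terabyte"
}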

Even for already accumulated media archives all of this is very relevant. How do you index a video library? How do you index an image library? It is work for machines: crawl and parse each frame and sequence of frames to classify what is there, remember the timestamp and clip location, make a thumbnail and give this information to the users of the media archives (other apps, services and people). Usually images are identified/parsed by Convolutional Neural Networks (CNN) or an Autoencoder + Perceptron. For various business purposes, a good way to start doing visual object identification right away is the Berkeley Caffe framework.

Ever heard about DeepMind? They are not on Kaggle today. They were there much earlier: one of them, Volodymyr Mnih, won a prize in early 2013. DeepMind invented some breakthrough technology and was bought by Google for $400 million (Facebook was another potential buyer of DeepMind). So what is interesting about them? Well, the acquisition was conditional on Google not abusing the technology. There is a special Ethics Board set up at Google to validate the use of DeepMind technology. We could try to figure out what their secret sauce is. All I know is that they went beyond dumb predefined machine learning by applying more neuroscience, which unlocked learning from one's own experience, with nothing predefined a priori.

Volodymyr Mnih was featured in a recent (at the moment of this post) issue of Nature, with an affiliation to DeepMind in the references. Read what they did: they built a neural network that learns game strategy, ran it on old Atari games and outperformed human players on 43 games!! It is a CNN with a time dimension (four chronological frames given as input). Besides the time dimension, another big difference from a classic CNN is the delayed-reward learning mechanism, i.e. it is a true strategy built from your previous moves. The algorithm is called Deep Q-learning, and the entire network is called a Deep Q-Network (DQN). It is a question of time before DQN will be able to handle more complicated graphical screens than old Atari. They have tried Doom already. Maybe StarCraft is next. And soon it will be business processes and workflows…
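For intuition, the core of Q-learning is a one-line value update. Here is a toy tabular version of that update in Go (the states and reward are made up, and the real DQN replaces this table with a convolutional network over game frames):

package main

import "fmt"

// update applies Q(s,a) += alpha * (r + gamma*max_a' Q(s',a') - Q(s,a)).
func update(q map[string][]float64, s string, a int, r float64, next string, alpha, gamma float64) {
	best := q[next][0]
	for _, v := range q[next] {
		if v > best {
			best = v
		}
	}
	q[s][a] += alpha * (r + gamma*best - q[s][a])
}

func main() {
	// Two made-up states with two possible actions each.
	q := map[string][]float64{"start": {0, 0}, "goal": {0, 0}}
	update(q, "start", 1, 1.0, "goal", 0.1, 0.9) // reward of 1 for taking action 1 in "start"
	fmt.Println(q["start"])                      // [0 0.1]
}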

Those who are subscribed to Nature, log in and read the main article, especially the Methods part. Others can check out the reader-friendly New Yorker post. Pay attention to the Nature link there; you might be lucky enough to access the Methods section on the Nature site without a subscription. Check out the DQN code in Lua, DQN + Caffe, and the DQN/Caffe ping-pong demo.

Who eats whom?

All right, that is enough on the importance of optical identification; it's time to switch back to the high level and continue on the Internet of Things as a macro trend, at global scale. Many of you are used to the statement that Software is eating the World. That's correct in two aspects: hardware flexibility is being shifted to software flexibility, and fabricators are making hard goods from digital models.

Shifting flexibility from hardware to software is a huge cost reduction for maintenance and reconfiguration. The evidence of hardware being eaten by software is all those SDX, Software Defined Everything: SDN aka Software Defined Networking, SDR aka Software Defined Radio, SDS aka Software Defined Storage, and so on for the Data Center, etc. The Tesla car is a pretty software-defined car.

But this World has not even been eaten by hardware yet! Miniaturization of electric digital devices allows Hardware to eat the World today and tomorrow. The penetration and reach of devices into previously inaccessible territories is stunning. We set up stationary identification devices (surveillance cameras, weather sensors, industrial meters, etc.) and launch movable devices (flying drones, swimming drones, balloons, UAVs, self-driving cars, rovers, etc.). Check out the excellent hardware trends for 2015. Today we put plenty of remora devices onto our cars and ourselves. Further miniaturization will allow us to take devices inside ourselves. The evidence is that Hardware is eating the World.

Wait, there are fabricators and nanofactories producing hard goods from 3D models! 3D-printed goods and the 3D-printed hamburger are evidence of Software directly eating the World. The conclusion, then, could be that Software is eating the World previously eaten by Hardware, while Hardware is eating the rest of the World at a higher pace than Software is eating it via fabrication.

Who eats whom? Second pass

Things are not so straightforward. We [you and me] are stuck in the silicon world. Those ruminations are true for electrical/digital devices and technologies, but things are not limited to the digital and the electrical. The movement of biohackers can't be ignored. Those guys are doing garage bio experiments on 5K equipment, exactly as Jobs and Woz did electrical/digital experiments in their garage at the birth of the PC era.

Biohackers are also eating the World. I am not talking about the standard boring initiation [of a biohacker], making something glow… There are amazing achievements. One of them is night vision. The electrical/digital approach to night vision is an infrared camera, a cooler and an analog optical picture into your eyes, or a radio scanner and analog/digital reconstruction of the scene for your eyes. The bio approach is putting Chlorin e6 drops into your eyes. With the aid of Ce6 you can see in the darkness at a range of 10 to 50 meters. Though there is some controversy around that Ce6 experiment.

The new conclusion for the “Eaters Club” is this:

  • Software is eating the World previously eaten by Hardware
  • Hardware is eating the rest of the World, a much bigger part than has already been eaten, at a high pace
  • Software is slowly eating the World through fabrication and nanofactories
  • Biohackers are eating the World, ignoring both the Hardware & Software eaters

Will the convergence of hardware and bio happen as it happened with software and hardware? I bet yes. For remote devices it could be very beneficial to take energy from the ambient environment, which could potentially be implemented via biological mechanisms.

sw_hw_bio

Blending it all together

Time to put it all together and emphasize the practical consequences. Small and smaller devices are needed to wrap an entire business (machines, people, areas). Many devices are needed, 50 billion by 2020. Networking is needed to connect those 50 billion devices. Data flow will grow from the 50 billion devices and within the network. The Data Gravity phenomenon, where data attracts apps, services and people to itself, will become more and more observable. Keep reading for details.

The Internet of Things is a sweet spot at the intersection of three technological macro trends: Semiconductors, Telecoms and Big Data. All three parts work together, but each evolves at a different pace. That leads to new rules of 'common sense' emerging within IoT.
1_IoT

Remote devices need networking, good networking. And we have an issue, one which will only strengthen. The pace of evolution for semiconductors is 60% annually, while the pace of evolution of networks is 50%. The pace of evolution of storage technology is even faster than 60% annually. It means that newly acquired data will fit into the network less and less over time [fewer chances for the data to get into the Cloud]. It means that more and more data will be left beyond the network [and beyond the Cloud].
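A toy illustration of that widening gap, assuming those 60% and 50% annual rates hold and both start from the same baseline:

package main

import "fmt"

func main() {
	// Assumed annual growth rates from the text: compute/storage ~60%, network ~50%.
	compute, network := 1.0, 1.0
	for year := 1; year <= 10; year++ {
		compute *= 1.60
		network *= 1.50
	}
	// After a decade the data we can produce outgrows what we can ship to the Cloud.
	fmt.Printf("after 10 years: compute x%.0f, network x%.0f, gap x%.1f\n",
		compute, network, compute/network)
}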

Off-the-Cloud data must be handled in place, at the location of acquisition or nearby. That means huge growth in Embedded Programming. All those small and smaller devices will have to acquire, store, filter, reduce and sync data. It is Embedded Programming with an OS or without an OS. It is distributed and decentralized programming. It is programming of dynamic mesh networks. It is connectivity from device to device without a central tower. It is a new kind of cloud programming, closest to the ground, called Fog. Hence Fog Programming, Fog Computing. Dynamic mesh networks, plenty of DSP, and potentially applicable distributed technologies for the business logic foundation, such as BitTorrent, Telehash, Blockchain. Interesting times in Embedded Programming are coming. This is just the Internet of Things Miniaturization phase. Add smart sensing on those P2P-connected small and smaller devices in the Fog, and the Internet of Things Identification phase will be addressed properly.

The Reference Architecture of IoT is seven-layered (because 7 is a lucky number?).
5_7layers

Conclusion

We are building the new tools that we will use to build our future. We are doing it through digitization of the World. Everything physical becomes connected and reflected in its digital representation. Don't overfocus on Software, think about Hardware. Don't overfocus on Hardware, think about Bio. Expect the convergence of software-hardware-bio as the most stable and eco-friendly foundation for those 50 billion devices by 2020.

Recall Peter Thiel and today's biz frustrations. With a digitized, connected World we will turn from the negative direction within the current spiral cycle into a positive one. And of course we will continue with long-term acceleration. The future looks exciting.

PS.
Music for reading and thinking: from the near future, Blade Runner, Los Angeles 2019


Svindal: Crash, Training, Recovery

Crash

He had multiple injuries… broken bones in his face and a 15 cm laceration to his groin and abdominal area. Svindal missed the remainder of the 2008 season.

Training

20 minutes on the bike, then balance and stretching on balls, then 140 kilograms… then much other stuff… then 10 minutes on the bike again. Maybe Red Bull helps.

Recovery

His first two victories following his return were a downhill and a super-G in Beaver Creek, on the same Birds of Prey course where he was injured the year before. A remarkable comeback from his crash a year ago, a true testament to mental strength. BTW, his focus is so strong that he does not blink once during a 2-minute run…


Grand Piano Redesign

Non-IT post. It’s about design and music.

When I played piano I used to pay attention to the instrument's visual look, keyboard feedback and produced sound, in that exact order. If the piano didn't look aesthetic, didn't mirror everything in its front panel, didn't have nice forms or wooden patterns, I didn't like that instrument. The keyboard was maybe the most important, but I still rank it second. If a keyboard was soft and easy, it was a disaster. My favorite keyboard had to be pretty demanding; you had to make your fingers kick each key properly. That was required for speedy playing. Really very fast playing, faster than the teachers. It was somehow impossible on an easy keyboard… The sound came last. I liked the sound of German grand pianos, didn't like the sound of Estonian ones. Though it was only the third priority.

So what do we have today? If I started playing piano again, what would the priorities be? The visual look would remain number one. And I would like a modern design. Check out what has emerged during these years… Maybe it's worth a try?

Whaletone

whaletone

Peugeot

peugeot

Boganyi

boganyi


Eye

My drawing during a boring lecture at university, called "Eye". Year 1993. Lviv.

img_1135.png


Security in IoT

@ 4th Annual Nordic Cloud & Mobile Security Forum in Stockholm

 

Internet of Things
as I understand it

IoT emerges at the intersection of Semiconductors, Telecoms, Big Data and their laws: Moore's Law for Semiconductors, observed as a 60% annual computing power increase; Nielsen's Law for Telecoms, observed as a 50% annual network bandwidth increase; Metcalfe's Law for networks, observed as the value of the network being proportional to the square of the number of connected nodes (humans and machines, many-to-many). The Law of Large Numbers is observed as known average probabilities for everything, so that you don't need statistics anymore. On a Venn diagram IoT looks smaller than any of those three foundations (Semiconductors, Telecoms and Big Data), but in reality IoT is much bigger: it is the digitization and augmentation of our physical world, both in business and lifestyle.

1_IoT

How do people recognize IoT? Probably some see only one web, some see another web, others see a few webs. There are six well-known webs: Near, Hear, Far, Weird, B2B, D2D [aka M2M]. Near is the laptop, the PC. Hear is the smartphone, smartwatch, armband, wristband, chestband, Google Glass, shoes with some electronics. Far is the TV, kiosk, projection surface. Weird is the voice and gesture interface to Near and Far, with potential new features emerging. B2B is app-to-app or service-to-service. D2D is device-to-device or machine-to-machine.

People used to sit in front of the computer; now we sit within a big computer. In 3,000 days there will be a super machine, let's call it One, according to Kevin Kelly. Its operating system is the web. One identifies and encodes everything. All screens look into One. One can read it all, all data formats from all data sources. To share is to gain, yep, the sharing economy. No bits live outside of One. One is us.

2_6webs

Where we are today
or five waves of IoT

Today we are at the Identification of everything, especially visually, and the Miniaturization of everything, especially with wearables and M2M. High hopes rest on visual identification and recognition. On the one hand, ubiquitous identification is simply needed. On the other hand, visual recognition and classification is probably the way to security in IoT. Instead of enforcing tools or rules, there are policies and some control over how those policies are applied. The rationale is straightforward: technologies change too fast, hence to build something lasting you should build policies. Policies are empowered by some technology, but remain agnostic of other technologies.

3_5waves

The fifth wave is the augmentation of life with software and hardware…

Who is IoT today? Let's take Uber. Today it is not. In several years, with self-driving cars, it will be. Tim O'Reilly perfectly described IoT as an ecosystem of things and humans. Below is a comparison, with a significantly extended outlook for tomorrow.

4_uber

It is a great step towards personalized experience that Uber linked Spotify to your cab, so that you get your individual stage in any Uber car. More about personal experience is in my previous post Consumerism via IoT, delivered in Munich.

IoT Reference Architecture
or magic of seven continues

Well, the high-level mind-washing stuff is interesting, but is there a canonical architecture for IoT? What can I touch as an engineer? There is a reference architecture [revealed several weeks ago by Cisco, Intel and others], consisting of seven layers, shown below:

5_7layers

Notice that the upper part is Information Technology, which is non-real-time and which must be personalized. The lower part is Operational Technology, which is real-time or near-real-time, and which is local and geo-spread. The central part is Cloud-aware, which is IT; it is centralized with strategic geo-distribution, with data centers at primary internet hubs and user locations.

From the infosec point of view, the top level is broken, i.e. people are broken. They continue to do stupid things, they are lazy, so it's not rational to try to improve people. They will drive you crazy with BYOD, BYOA and BYOT (bring your own device/app/technology). It is better to invest in technologies which are secure by design. Each architectural layer has its own technological & security standards, reinforced by industry standards. Really? Yes for the upper part, and not so obvious for the lower…

Pay attention to the lower part, from Edge Computing downwards. It is blurred technology as of today; it could be called Fog. Anyway, Cisco calls it Fog. The Fog perfectly reflects the cloud closest to the ground; it encapsulates plenty of computing, storage and networking functionality within. Fog provides localization and context awareness with low latency. Cloud provides global centralization, probably with some latency and less context. Experience on top of Cloud & Fog should provide profiling and personalization, a personal UX. The World is flat. The World is not flat. It depends on which layer of IoT you are at right now.

Edge of computing
or computing at the Edge

Data grows so fast that in many scenarios it simply can't be moved to the Cloud for intelligence; hence BI comes to the Data. Big Data has big gravity and it attracts apps and services to itself. And hackers too. Gathering, filtering, normalizing and accumulating data at the location or nearby, outside the cloud, is called Edge Computing. It is often embedded programming of single-board computers or other mediums (controllers, Arduino, Raspberry Pi, Tessel.io, smartphones when more computing power is required).

6_fog0

 

Fog Computing
or cloud @ data sources

Fog Computing is a virtualized distributed platform that provides computing, storage, and networking services between devices and the cloud. Fog Computing is widespread, uncommon, interconnected. Fog Computing is location-aware, real-time/near-real-time, geo-spread, large-scale, multi-node, heterogeneous. Check out http://www.slideshare.net/MichaelEnescu/michael-enescu-cloud-io-t-at-ieee

6_fog1

Fog is hot for infosec, because plenty of logic and data will sit outside of the cloud, outside of the office, somewhere in the field… very vulnerable because of the immaturity of IoT technologies at that low level.

Secure Fog Fabric
or security by design

How do we find or build technologies for Fog Computing which would be secure by design? Which would live quite long, like TCP/IP :) Is it possible? Do some candidate technologies exist so far? Potentially they should be built on top of proven open-source tools & technologies, to keep trust and credibility. It all must synergize at a large collaboration scale to break through with a proper tech fabric. So what do we have today? Fog is about computing, storage and networking, just a bit different from the same stuff in the cloud or in the office.

Computing. Which computing is secure, transactional and distributed? And could fit onto a Raspberry Pi? Ever thought about Bitcoin? Ha! Bitcoin's blockchain algorithm is exactly that secure, transactional, distributed engine, even a platform. Instead of computing numbers for encryption and mining Bitcoins, you could do a more useful computing job. The technology has all the necessary features included by design. Temporary and secure relations are established between smartphones, gadgets and devices, and transactions happen. Check out the Block Chain details.
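To make the idea concrete, here is a toy hash-chained ledger in Go. It is nothing like the real Bitcoin protocol (no proof-of-work, no network, no consensus), only the tamper-evidence property that makes a blockchain interesting for the Fog; the sensor payloads are invented:

package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// Block links a payload to the hash of the previous block,
// so changing any earlier record breaks every later hash.
type Block struct {
	Data     string
	PrevHash string
	Hash     string
}

func newBlock(data, prevHash string) Block {
	sum := sha256.Sum256([]byte(prevHash + data))
	return Block{Data: data, PrevHash: prevHash, Hash: hex.EncodeToString(sum[:])}
}

func main() {
	chain := []Block{newBlock("genesis", "")}
	for _, reading := range []string{"sensor A: 21.5 C", "sensor B: door open"} {
		chain = append(chain, newBlock(reading, chain[len(chain)-1].Hash))
	}
	for _, b := range chain {
		fmt.Printf("%.12s...  <-  %q\n", b.Hash, b.Data)
	}
}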

Storage. Data sending & receiving. Which technology is distributed, efficient on low-bandwidth networks, reliable and proven? BitTorrent! BitTorrent is not for pirates, it is for Fog Computing: for mesh networks and efficient data exchange over many-to-many topologies, built on a P2P protocol. BitTorrent is good for video streaming too. Check out the BitTorrent details.

Identification. Well, maybe it's not identification of everything and everyone, but authentication and authorization are needed anyway, and needed right now. Do we have such a technology? Yes, it is Telehash! Good for mesh networks, based on JSON, it enables secure messaging. Check out the Telehash details.

6_fog2

Fog Computing is a new field; we have to use applicable secure technologies there, or create new, better technologies. It looks like it is going to be a hybrid: something applied, something invented. Check out the original idea from IBM Research for the original arguments and ideation.

Security for IoT

A proposal is to go ahead with the OWASP Top 10 for IoT. Just google for OWASP and a code like I10 or I8. You will get a page with recommendations on how to secure a certain aspect of IoT. The list of ten doesn't match the seven layers of the reference architecture precisely, although some relevance is obvious. Some layers are matched. Some security recommendations are cross-functional, e.g. Privacy.

7_OWASP

For Fog Computing pay attention to I2, I3, I4, I7, I9 and I10. All those recommendations can be googled by those names, though they are named slightly differently on the OWASP site. Below is a list of hyperlinks for your convenience. Enjoy!

I1 Insecure Web Interface
I2 Insufficient Authentication/Authorization
I3 Insecure Network Services
I4 Lack of Transport Encryption
I5 Privacy Concerns
I6 Insecure Cloud Interface
I7 Insecure Mobile Interface
I8 Insufficient Security Configurability
I9 Insecure Software/Firmware Updates
I10 Poor Physical Security

More about the Internet of Things, especially from the user's point of view, can be found in my recent post Consumerism via IoT.
