
Color Paradigm Shift

This post is about colors and emotions, especially the modern relationship between them.

Old School?

Take a look at this color wheel of emotions; a high-resolution, better-quality version is available online as a PDF.
wheel

Questions

Let's question several colors one by one: red, orange, violet, blue, yellow…

Red

The color wheel says red is about feeling angry, irate, frustrated, aggravated. Really? What about Christmas cards and stripes? They are all red. Santa Claus is all red. St. Valentine's stuff is all red, and it is about love. Wedding packages use red for love and passion. On metro maps, "You are here" is marked in red. Google Maps uses red location balloons. Women's lips are red. Mickey Mouse wears red. Strawberries, watermelons, cherries, apples, and pomegranates are red. There are so many positive reds in nature, society, technology, and so on, and they are in no way negative. On the contrary, they are quite positive.
red

Orange, Yellow

The color wheel says orange is nervous, worried, concerned, confused; yellow is afraid, apprehensive… Let's look at what we actually have and use in orange and yellow, and check the emotions. Oranges are in no way nervous or concerning; orange juice is an everyday drink. Yellow dresses are cool. Orange is a standard Lamborghini color, and all is good with the emotions there.
orange_yellow

Violet, Blue

The color wheel says violet is disgust, distaste, disappointment; blue is sadness, grief. Let's check around us. What about the violet fairy? Violet lavender fields, violet blossoms? Blue is common at concerts; check out the pleasant folk music of Blackmore's Night… The horizon is often blue, and there is nothing sad about it.
violet_blue

Conclusion

The perception of colors is different today, and it seems it has been different for years. That emotion color wheel is wrong. There are so many real-life cases where vivid violet, orange, and blue are used and look positive, emotionally positive. Check out the cutting-edge wearable products from Jawbone and see that there is nothing wrong with their colors at all. Emotions are driven by combinations, by balance, by synergy, but never by a damn single color.
jawbone

PS.
I don't know how to add more vertical space before headers. WordPress usability sucks.


Advanced Analytics, Part V

This post covers the details of visualizing information for executives and operational managers on the mobile front-end: what is descriptive, what is predictive, what is prescriptive, what it looks like, and why. The scope of this post is the cap of the information pyramid. Even if I start with something detailed, I still remain at the very top, at the level of the most important information, without details on the underlying data. Previous posts contain the introduction (Part I) and the pathway (Part II) of the information to the user, especially executives.

Perception pipeline

The user's perception pipeline is: RECOGNITION –> QUALIFICATION –> QUANTIFICATION –> OPTIMIZATION. During recognition the user just grasps the entire thing and starts to take it in as a whole; ideally we should deliver a personal experience, so the information will be valuable, though probably delivered slightly differently from the previous context. More on personal experience in the chapter below. As soon as the user grasps/recognizes, she is able to classify, or qualify, by commonality. The user operates with categories and scores within those categories. The scores are qualitative and very friendly for understanding, such as poor, so-so, good, great. Then the user is ready to reduce subjectivity and turn to numeric measurement/scoring. That is quantification: converting good and great into numbers (absolute and relative). As soon as the user is all set with numeric measurements, she is able to improve/optimize the business, the process, or whatever the subject is.
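The qualification-to-quantification step can be sketched in code. This is a minimal illustration, not part of any described product; the category names and thresholds are my own assumptions:

```python
# Sketch of the qualification -> quantification step: qualitative scores
# become numbers so they can be compared and optimized.
# Category names and thresholds are illustrative assumptions.

QUALITATIVE = ["poor", "so-so", "good", "great"]

def qualify(value, thresholds=(0.25, 0.5, 0.75)):
    """Map a normalized KPI value (0..1) to a qualitative score."""
    for label, upper in zip(QUALITATIVE, thresholds):
        if value < upper:
            return label
    return QUALITATIVE[-1]

def quantify(label):
    """Convert a qualitative label back to a representative number."""
    index = QUALITATIVE.index(label)
    return (index + 0.5) / len(QUALITATIVE)  # midpoint of the bucket

print(qualify(0.9))       # great
print(quantify("good"))   # 0.625
```

The round trip is lossy by design: qualification throws away precision so the user can reason in friendly categories, and quantification restores a representative number when subjectivity has to be reduced.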

Default screen

What should be rendered on the default screen? I bet it is a combination of the descriptive, predictive, and prescriptive, with a large portion of space dedicated to the descriptive. Why is descriptive so important? Because until we build AI, trust and confidence in computer-generated suggestions are not at the right level. That's why we have to show the 'AS IS' picture, to convey how everything works and what is happening, without any decorations or translations. If we deliver such a snapshot of the business/process/unit/etc., the issue of trust between human and machine might be resolved. We tend to believe that machines are pretty good at tracking tons of measurements, so let them track it and visualize it.

There must be an attempt from the machine to advise the human user. It could be done in the form of a personalized sentence, on the same screen, along with the descriptive analytics. So putting in some TODOs is absolutely OK, while believing that the user will trust and follow them is naive. The user will definitely dig into the details of why such a prescription is proposed. It's normal for the user to be curious about the root-cause chain. Hence, be ready to provide almost the same information with additional details on the reasons/roots, on trends/predictions, on classification and pattern recognition within KPI control charts, and on the prescriptions themselves. If we visualize [on top of the inverted pyramid] with a text message and a stack of vital signs, then we have to prepare an additional screen to answer that list of details. We will still remain at the top of the pyramid.

default_screen

 

Next screen

If we got 'AS IS', then there must be 'TO BE', at least for the symmetry:) The user starts on the default screen (recognition and qualification) and continues to the next screen (qualification and quantification). The next screen should have more details. What kind of information would be logically relevant for the user who got the default screen and looks for more? Or, better to say, looks for 'why'? Maybe it's time to list it as bullets for clarity:

  • dynamic pattern recognition (with a highlight on the corresponding chart or charts) of what is going on; it could be one of the seven performance signals, and at minimum the three essential signals
  • highlighting of the area of the significant event [dynamic pattern/signal] on the other charts, to juxtapose what is going on there and foster thinking about potential relations; it's still the human who thinks, while the machine assists
  • parameters & decorations for the same control charts, such as min/max/avg values, and identification of quarters, months, sprints, weeks, and so on
  • normal range (also applicable to the default screen), or even ranges, because they could differ between quarters or years
  • trend line, using the most applicable method for approximation/prediction of future values; e.g. a release forecast
  • clickable parts, for digging from relative values/charts into absolute values/charts for even more detailed analysis; from qualitative to quantitative
  • your ideas here

signal

Recognition of signals as dynamic patterns is identification of the roots/reasons for something. Predictions and prescriptions could be driven by those signals. Prescriptions could be generic, but it's better to make them personalized. Explanations could be designed for personal needs/perception/experience.
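Signal detection on a KPI control chart can be sketched as follows. The post doesn't spell out the seven performance signals, so this sketch assumes two common control-chart rules (a 3-sigma outlier and a sustained one-sided run), with limits computed from a baseline period:

```python
# Sketch of dynamic pattern (signal) detection on a KPI control chart.
# Assumes two classic control-chart rules: a point beyond 3 sigma, and a
# run of points on one side of the center line. Limits come from a
# baseline (stable) period, as in conventional control charting.
from statistics import mean, stdev

def control_limits(baseline):
    """Center line and 3-sigma limits computed from a stable baseline."""
    m, s = mean(baseline), stdev(baseline)
    return m, m - 3 * s, m + 3 * s

def detect_signals(series, baseline_len=8, run_len=6):
    """Return (index, rule) pairs where the series shows a signal."""
    m, lo, hi = control_limits(series[:baseline_len])
    signals = []
    for i in range(baseline_len, len(series)):
        if not (lo <= series[i] <= hi):
            signals.append((i, "outlier"))     # single extreme point
    for i in range(run_len - 1, len(series)):
        window = series[i - run_len + 1 : i + 1]
        if all(v > m for v in window) or all(v < m for v in window):
            signals.append((i, "shift"))       # sustained one-sided run
    return signals
```

In the UI, each returned index would drive the highlight on the corresponding chart and the juxtaposed areas on the neighbouring charts.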

 

Personal experience

We consume information in various contexts. If it is the release of a project or product, then the context is different from the start of sprint zero. If it's a merger & acquisition, then the expected information is different from a quarterly review. It all depends on the user (from the CEO to CxOs to VPs to middle management to team management and leadership), on the activity, and on the device (Moto 360, iPhone, iPad, car, TV, or laptop). It matters where the user is physically; location does matter. Empathy does matter. But how to achieve it?

We could build a user's interests from social networks and from interaction with other services. Interests are relatively static in time. It is also possible to figure out intentions. Intentions are dynamic and useful only while they are hot. Business intentions are observable from business comms. We could sense the intensity of communication between the user and the CFO and classify it as a context related to budgeting or budget review. If we put sensors on the corporate mail system (or mail clients) and combine them with GPS or Wi-Fi location sensors/services, or with a manual check-in somewhere, we could figure out that the user has indeed intensified comms with the CFO and they have worked together face-to-face. Having such a dynamic context, we are able to deliver information in that context.
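The dynamic-context idea above could be sketched roughly like this. Everything here is hypothetical: the signal names, the thresholds, and the context labels are assumptions for illustration only, not a real service:

```python
# Illustrative sketch of dynamic-context sensing: combine comms intensity
# with colocation to guess a hot business context. All field names,
# weights, thresholds, and labels are hypothetical assumptions.
from dataclasses import dataclass

@dataclass
class Signals:
    mails_with_cfo_last_week: int
    meetings_with_cfo_last_week: int
    colocated_with_cfo: bool   # from GPS/Wi-Fi or a manual check-in

def classify_context(s: Signals) -> str:
    # meetings weigh more than mails when estimating intensity
    intensity = s.mails_with_cfo_last_week + 3 * s.meetings_with_cfo_last_week
    if intensity >= 10 and s.colocated_with_cfo:
        return "budget-review"   # hot intention: face-to-face work detected
    if intensity >= 10:
        return "budgeting"
    return "default"

print(classify_context(Signals(8, 2, True)))  # budget-review
```

With such a label in hand, the default screen could reorder itself to lead with budget-related vital signs and a budget-flavoured text message.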

The concept of personal experience (or personal UX) is similar to a braid (the hairstyle). Each graph of data helps to filter relevant data. Together those graphs allow us to locate the real-time context. Having such a personal context, we can build and deliver the most valuable information to the user. More details on how to handle the interest graph, intention graph, mobile graph, and social graph, and on which sensors could bring in this modern new data, are available in my older posts. So far, I propose to personalize the text message for the default screen and the next screen, because it's easier than vital signs, and it fits into wrist-sized gadgets like the Moto 360.


Wearable Technology

It's the right time to wrap up a short story about modern technology. Today I will talk about what will be massively adopted three years from now: Wearable Technology. Everything is very simple: the current smartphones will not last long, because their design sucks. The "brick design" is not human-friendly. Bigger smartphones are even worse than smaller ones. This is a result of technology still too immature to produce better hardware. But things have started to change.

The Momentum

Documented research on wearables for the consumer market dates back to the end of the 1990s, which is approximately 15-17 years. It usually took about 25 years for a new technology to win us over, but taking into consideration the exponential acceleration of technological progress, the time frame is shrinking. The adoption has begun, and you will see what happens in three years. To make a relevant analogy, recall what you knew about tablets before April 2010. There was nothing sustainable on the market. Old tablets from Microsoft don't count because of poor sustainability… Now look at your hands, look around: many people hold iPads, and they use them for both work and life. That happened in three years. Hardware changes in three years. Google Glass is exciting; they tease us today, but everybody will wear glasses from Google and some competitors in just a few years.

Software + Services?

Now, take a few minutes to read the spec of Google Glass. It is Android. It is a platform for apps, because a web browser is too heavy for such a limited environment. Sorry, SaaS, you will have to wait until the era of apps comes to its logical end. But today is a perfect time for apps, embedded or Android or similar. Take another maker, Suunto. They converted wearable accessories into app platforms. You can download an app for your favorite Ambit from the app zone. Let's move further along the quantified-self trend. The next stop is health. There are nice wearable products for nutrition, fitness, and sleep monitoring: from the very sexy, fashionable Jawbone, to the more classical-looking Fitbit, to the sporty Nike Fuel, to the modest calorie-burn tracker BodyMedia. These wearable buddies simply have neither the room nor the battery capacity for a web browser (read: for fat HTML and JavaScript processing). Of course they synchronize with smartphones, and then with the backend. It is a new model: Software + Software + Services, where the first software is embedded, the second is the mobile app, and the services are in the cloud.

Could wearable gadgets work without a smartphone? Yes and no. They all work in offline mode, silently gathering/measuring you. Then you connect via Bluetooth to a smartphone, or with a USB wire to a laptop, and sync the data with the cloud services. Theoretically, a wearable gadget can bypass the phone and work directly with the services. It is called the Internet of Things, when each "thing" is connected to the Internet, or vice versa, the Internet is constituted from connected "things". The more things connected, the better the real world is duplicated/reflected in the information system. The goal is to replicate the physical world into the digital one.

If we take a different class of devices, the ones that connect other machines to the Internet, we have to mention M2M. It is a very simple concept: to connect something disconnected, you put in an additional machine, aware of the first machine on one side and of the Internet on the other. Those M2M devices usually talk HTTP with the backend, and USB or COM (with a native protocol) with the originally disconnected machine. Hence, there is a significant piece of software on the M2M man-in-the-middle. High-level logic could run there, e.g. predictive or prescriptive analytics, which can work even in offline mode if the accumulated or synced data is sufficient for that. Software + Services is what we need to understand to design the Internet of Things for machines and for people. Wearable technology will run software for the next three years for sure.
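A minimal sketch of such an M2M man-in-the-middle, with offline accumulation, might look like this. The device-reading and upload callables are hypothetical stand-ins for the native protocol and the HTTP backend:

```python
# Sketch of the M2M man-in-the-middle described above: a native protocol
# on the machine side, an HTTP-style upload on the Internet side, and
# offline accumulation in between. `read_measurement` and `upload` are
# hypothetical stand-ins, not a real device API.

class M2MRelay:
    def __init__(self, read_measurement, upload):
        self.read_measurement = read_measurement  # USB/COM, native protocol
        self.upload = upload                      # e.g. an HTTP POST in real life
        self.buffer = []                          # accumulated offline data

    def poll(self):
        """Read one measurement, then try to flush everything upstream."""
        self.buffer.append(self.read_measurement())
        try:
            self.upload(self.buffer)
            self.buffer = []          # delivered; start accumulating anew
        except ConnectionError:
            pass                      # offline: keep the data for later
```

While the backend is unreachable, measurements simply pile up in `buffer`; the accumulated data is also what any on-relay analytics would run against.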

Wearables!

The "brick design" is bad, but what is better? Any other anatomic design is way better. Wait a bit and there will be new stuff built out of graphene. It has ideal properties to be an excellent component of chips and circuits. There are billion-euro grants this decade to bring graphene into our lives. Good old Nokia got 1.35 billion euro to develop graphene. Maybe they will make the Morph concept a reality!

Who worries about the chips? They are almost transparent today. The first transparent integrated circuit was built back in 2006. Today, achieving 80% transparency in chips is not a problem. Having graphene as a basis guarantees that modern devices will be easily bendable. All those tablets still lose to paper because they are not bendable. We have almost achieved paper resolution (dpi) on tablets, but we still struggle with everyday usability.

Quantified Self

Everybody is going to be measured. If you want to measure yourself, just use wearable technology and track the rest of your activity with software products. Just put a smart wristband, health monitor, or GPS onto yourself or your bike and forget about it. The data will be continuously recorded. If you don't like to track yourself, you still cannot avoid it, because big stores will track you via cell phone triangulation. They use fake signaling to make your phone reveal itself, thus identifying your location and movement inside the store. You are tracked via surveillance cameras too… but that is not wearable, hence out of the scope of this topic.

Don't worry about the wearable guys; they will produce tons of data about you. Using tools to record other meaningful activities, though, might be painful. Here is a sample of how one man experienced it with Google Calendar. I foresee that non-wearable tracking will be resolved via image/object recognition. The retail industry needs it very much for descriptive, predictive, and prescriptive analytics.

Internet of Things

Those things that cannot be designed to work with the cloud directly will be classified as smart accessories to a smarter device: the smartphone. At the beginning, the smartphone will be the man-in-the-middle for almost all wearable devices, like the AliveCor ECG for iPhone. But then those devices will be able to bypass the phone. In other words, we are using the smartphone as an M2M intermediary to connect a wearable machine to the Internet. We get the Internet of Things right away, just not as big as it could be, and with heavy use of the phone for M2M. In the near future we will get an even bigger Internet of Things without the phones, as wearable gadgets will be designed to be connected right out of the box.

Ideally, there will be no need for a visible computer (laptop, smartphone, or tablet) in common situations. The number of computers/machines will be bigger, and their size will be smaller. And many of them will be wearable and connected.


Mobile Home Screens


Better Home Screen?

Definitely without iPhone-ish glossy icons,
wasting potentially useful space around them.
Want it or not, glossy must stay in the past.
The near future is flat.

With aligned multi-sized widgets like Winphone.
With a variety of widgets like Android.
But much more aesthetic than they are today!

With more information embedded into the icon-sized area,
like Facebook Home: three faces in a small context area.
Ideally the home screen can deliver useful information in five or six different contexts, even without any clicks.

Smaller widgets will remain and prevail,
because they are sized to our finger…


Mobile UX: home screens compared

35K views

Some time in 2010 I published my insights on the mobile home screens of four platforms: iOS, Android, Winphone, and Symbian. Today I noticed it has gotten more than 35K views:)

What now?

What has changed since then? IMHO, the Winphone home screen is the best, because it delivers multiple locuses of attention with contextual information within them. But as soon as you go off the home screen, everything else there is poor. iOS and Android remain lists of dumb icons. No context, no info at all. The maximum possible is a small badge with the number of calls or text messages. And Symbian has died. RIP, Symbian.

So what?

Vendors must improve the UX. Take the informativeness of the Winphone home screen, add the aesthetics of iOS graphics, add the openness and flexibility of Android (read Android First), and finally produce a useful hand-sized gadget.

Winphone's home screen provides multiple locuses of attention as small containers of information. They come mainly in three sizes. The smallest box has enough room to deliver much more contextual information than the number of unread text messages. By rendering an image within the box, we can achieve a kind of Flipboard interaction: you decide from the image whether you are interested or not. How efficiently the box room is used is a second question. My conclusion is that it is used inefficiently. There are still just numbers of missed calls or texts, with much room left unused:( I don't know why the concept of small contexts has been left underutilized, but I hope it will improve in the future. Furthermore, it could be improved on Android, for example. The Android ecosystem has great potential for creativity.

Maybe I will visualize this when I get some spare time… Keep in touch here or on Slideshare.


Mobile EMR, Part V

Some time ago I described the ideation of a mobile EMR/EHR for medical professionals. We came up with a tablet concept first: the EMR/EHR is rendered on iPad and Android tablets, with identical look & feel (the iPad feels better than the Samsung Galaxy). Read about the tablet EMR in the four previous posts; BTW, one of them contains feedback from Edward Tufte:) Mobile EMR Part I, Part II, Part III, Part IV.

We have moved further and designed a concept for a hand-sized version of the EMR/EHR, rendered on iPhone and Android phones. This post is dedicated to the phone version. As you will see, the overall UI organization is significantly different from the tablet, while reuse of smaller components between tablets and phones is feasible. The phone version is entirely SoftServe's design, hence we carry responsibility for the design decisions made there. For sure, we tried to keep both the tablet and phone concepts consistent in style and feel. You can judge how well we accomplished that by comparing them yourself:)

Patients

The lack of screen space forces us to introduce a list of patients. The list scrolls vertically. A tap on a patient takes you to the patient details screen. It is possible to add some very basic info for each patient on the patient list screen, but not much: cases with long patient names simply leave no space for more. I think that admission date, age, and sex labels must be present on the patient list in any case; we will add them in the next version. A red circular notification signals the availability of new information for the patient, e.g. new labs are ready or an important significant event has been reported. The interaction design concept supposes that the medical professional will tap the patient marked with notifications. In addition, the list of patients is ordered per user: the MD can reorder the list via drag'n'drop.

Patient list


MD can scan the wristband to identify the patient.

Wristband scanning


Patient details

The MD goes to the patient details by tapping the patient in the list. That screen is called the Patient Profile. It is a long screen. There is a stack of Vital Signs right at the top; the Vital Signs widget is reused from the tablets as-is and fits the phone screen width perfectly. Then there is the Meds section. The last section is the Clinical Visits & Hospitalization chart; it is interactive (zoomable), like on the iPad. Within a single patient, the MD gets multiple options. The first option is to scroll down to see all the information and the entry points to more. Notice the menu bar at the bottom of the screen: the MD may prefer to go directly to Labs, Charts, Imagery, or Events. The interaction is organized via tabs; the default tab is the patient Profile.

Patient profile


Patient profile, continued


Patient profile, continued


Labs

There is not much space for tables. Furthermore, lab results are clickable, hence the row size should be relative to the size of a finger tap. The most recent lab numbers are highlighted in bold. Deviations from the normal range are highlighted in red. The most recent lab numbers can be placed on either the left or the right of the table; it's configurable. The red circular notification on the Labs menu/tab shows how many new results have become available since the last view of this patient.
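The highlighting rules above (bold for the most recent value, red for out-of-range) can be sketched as a tiny rendering helper; the test names, units, and normal ranges here are illustrative assumptions, not clinical reference values:

```python
# Sketch of the labs-table highlighting: bold the most recent value,
# red for deviations from the normal range. Test names and ranges are
# illustrative assumptions only.

NORMAL_RANGES = {"WBC": (4.0, 11.0), "HGB": (12.0, 17.5)}  # assumed units

def render_cell(test, value, is_most_recent):
    lo, hi = NORMAL_RANGES[test]
    style = []
    if is_most_recent:
        style.append("bold")
    if not (lo <= value <= hi):
        style.append("red")   # deviation from the normal range
    return {"value": value, "style": style}

print(render_cell("WBC", 14.2, True))  # {'value': 14.2, 'style': ['bold', 'red']}
```

The same predicate that sets the red style could also feed the count on the Labs tab notification.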

Labs


Measurements

Here we reuse the 'All Data' charts smoothly; they fit the phone screen perfectly. The layout is two columns with downward scrolling. Charts with notifications about new data are highlighted. The MD can reorder the charts as she prefers, and can manage the list of charts too, switching them on and off from the app settings. There could be grouping of charts based on the diagnosis; we are considering this for the next versions. A reminder about the chart structure: the rightmost, biggest part of the chart renders the most recent data, since admission, with dynamics. Min/max are depicted with blue dots, the latest value with a red dot. The chart title also shows the numeric value in red, to be logically linked with the dot on the chart. The left, thin part of the chart consists of two sections: the previous year's data, and old data from before last year (if such data is available). Only deviations and anomalies are meaningful from those periods. Extreme measurements are comparable across the entire timeline, while precise dynamics are shown for the current period only. More information about the 'All Data' concept can be found in Mobile EMR, Part I.
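The chart structure just described can be sketched as a data model: the current period keeps the full series, while past periods are reduced to their extremes, which is all that remains meaningful at that width. Field names are illustrative:

```python
# Sketch of the 'All Data' chart model: the current period keeps the full
# series with dynamics; past periods are reduced to anomalies (min/max).
# Field names are illustrative assumptions.

def all_data_chart(old, last_year, current):
    summary = lambda xs: {"min": min(xs), "max": max(xs)} if xs else None
    return {
        "old": summary(old),            # prior to last year: extremes only
        "last_year": summary(last_year),
        "current": {                    # since admission: full dynamics
            "series": current,
            "min": min(current),
            "max": max(current),        # blue dots
            "latest": current[-1],      # red dot, echoed in the title
        },
    }
```

Reducing the historical sections to extremes is what lets the chart keep the entire timeline comparable inside such a narrow strip.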

Measurements in 'All Data' charts


Tapping on a chart brings up a detailed chart.

Measurement details


Imagery

Designing the entry point into the imagery was no big deal: just a two-column, scroll-down layout, like for the Measurements. A tap on an image brings up a separate screen completely dedicated to previewing that image. For huge scans (4GB or so) we reused our BigImage solution, to achieve smooth zoom in and zoom out, like Google Maps, but for medical imagery.

Imagery


Tissue scan, zoom in


Significant events & notes

Just a separate screen for them…

Significant events


Conclusion: it’s BI framework

The entire back-end logic is reused between the tablet and phone versions of the EMR. Vital Signs and 'All Data' charts are reusable as-is. The Clinical Visits & Hospitalization chart is cut to a shorter width, but easily reused too. Security components for data encryption and compression are reused. Caching is reused. Push notifications are reused. Wristband scanning is reused. Labs are partially reused. Measurements are reused. BigImage is reused.

Reusability is both physical and logical. For the medical professional, all this stuff is technology-agnostic: MDs see Vital Signs on iPad, Android tablet, iPhone, and Android phone as the same component. For geeks, it is obvious that reusability happens within each platform, iOS and Android. All widgets are reusable between iPad and iPhone, and between the Samsung Galaxy Tab and the Samsung Galaxy phone. Cloud/SaaS stuff, such as BigImage, is reusable across all platforms, because it is web-based and rendered in web containers, which are already present on each technology platform.

The most important conclusion is that mEMR is a proof of a BI framework suitable for any other industry. Any professional can consume almost-real-time analytics from her smartphone. Our concept demonstrated how to deliver highly condensed, related data series, with dynamics and synergy, for proper analysis and decision-making by a professional, plus a solution for huge imagery delivery on any front-end. Text delivery is simple:) We will continue the concept research on the waves of technology (BI, Mobility, UX, Cloud) and within digitizing industries: Health Care, Biotech, Pharma, Education, Manufacturing. Stay tuned to hear about the Electronic Batch Record (EBR).


Mobile EMR, Part IV

This is a continuation of Mobile EMR, Part III.

It turned out to be possible to fit more information onto the single pager! We have extended the EKG slightly, reworked the lab results, reworked the measurements (charts), and inserted a genogram. The genogram probably brings the most new information compared to the other updates.

v4 of mEMR concept

Right now the concept of mobile EMR looks this way…

Mobile EMR v4


New ‘All Data’ charts

Initially, the charts of measured values were made of dots. Recent analysis and reviews tended toward connecting the dots, but things are not so straightforward… There could be a kind of sparkline for the current period (7-10 days). The applicability of the sparkline technique to data from the entire last year is questionable. Furthermore, if more data is available from the past, it will be a mess rather than a visualization, because such narrow space is allocated for old data. Sure, that section of the chart could be wider, but is it worth it?

What is most informative from the past periods? Anomalies, such as low and high values, especially in comparison with current values. Hence, we have left the old data as dots, kept the previous year's data as dots, and made the current short period a line chart. We have added min/max points to ease the MD's analysis of the data.

Genogram

Having the genogram on the default screen seems very useful. User testing is needed to check the concept on real genograms, and to check the sizes of genograms used most frequently. Anyhow, it is always possible to show part of the genogram as an expanded diagram while keeping other parts collapsed. The genogram could be interactive: when the MD clicks on it, she gets to a new screen totally devoted to the genogram, with all detailed attributes present. Editing could be possible too. The default screen, though, should present a view of the genogram that relates to the patient's current or potential diagnosis.

In the future, the space allocated for the genogram could be increased, depending on the speed of evolution of genetics-based treatments. Maybe visualization of personal genotyping will be put onto the home screen very soon. There are companies providing such a service and keeping such data (e.g. 23andMe). Eventually all electronic data will be integrated, hence MDs will be able to see a patient's genotype data from the EMR app on the tablet.

DNA Sequence

This is the mid-term future. DNA sequencing is still a long process today, but we have the technology to deliver DNA sequence information onto the tablet. The technology is similar to BigImage™. Predefined levels of information delivery could be defined, such as genes, exons, and finally the entire genotype. For sure, additional layers and overlays will be needed to simplify visual perception and navigation through the genetic information, so the technology should be advanced in that respect.
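The level-of-detail delivery could be sketched like this; the level names and the annotation format are assumptions for illustration, not the actual BigImage API:

```python
# Sketch of level-of-detail delivery for a DNA sequence, in the spirit of
# the BigImage-style approach described above. Level names and the
# annotation format are hypothetical assumptions.

LEVELS = ["genes", "exons", "bases"]

def viewport(sequence, annotations, level, start, end):
    """Return what the client should render for a zoom level and range."""
    assert level in LEVELS
    if level == "bases":
        return sequence[start:end]          # deepest zoom: raw letters
    # coarser zoom: only annotation names overlapping the visible range
    return [a["name"] for a in annotations[level]
            if a["start"] < end and a["end"] > start]
```

As with map tiles, the client never pulls the whole genotype: it asks for one level and one range at a time, which is what makes tablet delivery feasible.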


Mobile EMR, Part III

This is a continuation of the previous posts, Mobile EMR, Part I and Mobile EMR, Part II.

We met with Mr. Tufte and demoed this EMR concept. He played with it for a while and suggested a list of improvements from his point of view.

‘All Data’ charts

Edward Tufte insists that sparklines work better than dots. It is OK that the sparklines will be of different sizes; it is natural that each measurement has its own normal range. Initially, we tried to switch the charts to lines, but then we rolled back. It seems we should make this feature configurable and use sparklines by default; if some MD wants dots, she can switch manually in the app settings.

Our EMR concept has been partially switched to sparklines: for the display of Vital Signs. Below is a snapshot:

Vital Signs

One more thing related to the Vital Signs: we did well by separating them into the widget on top and grouping them together. It adds much value, because they are related to each other, and it is important to see what happened to all of them at each moment. Our approach, based on user testing, appeared to be a winning one!

Space

The current use of the space could be improved even more. The first reason is that the biggest value of this research was keeping 'All Information' on a single screen. The human eye recognizes perfectly which type of information is needed. All the space is tessellated into multiple locuses of attention; the eye locks onto the desired locus and then focuses within it. The second reason is the iPad resolution. We can squeeze more from the retina resolution without degrading usability (such as the size of labels and numbers). It is possible to scale down to newspaper typography on the iPad, and hence fit more information into the screen estate.

Genogram

This confirms the modern trend toward genetics and genetic engineering. A genogram is a special type of diagram visualizing a patient's family relationships and medical history. In medicine, genograms provide a quick and useful context in which to evaluate an individual's health risks. Many new treatments are tailored to the genotype of the patient. E.g., Steve Jobs's cancer was periodically sequenced and brand-new proteins were applied, to prevent the disease from spreading. All cells are built from proteins, reading other proteins as instructions; this is true for cancer cells too, so if they read instructions from fake proteins, they cannot build themselves properly. We liked this idea immediately, because its value is instant and big, and its importance is as high as that of allergies. Below is a sample genogram, using special markers for genetically influenced diseases.

Sample Genogram

There are other cosmetic observations, which will be addressed shortly. We continue usability testing with medical doctors. More to come; it could be Mobile EMR on iPhone. Stay tuned.

UPDATE: Continued on Mobile EMR, Part IV.


Mobile EMR, Part II

On the 27th of August I published Mobile EMR, Part I. This post is a continuation.

The main output from the initial implementation was feedback from users: they just needed even more information. We initially considered mEMR as All Information vs. Big Data. But it turned out that some important information was missing from the concept based on the Powsner/Tufte research. Hence, we have added more information and are now ready to show the results of our research.

First of all, the data is still slightly out of sync, so please be tolerant. Fixing this is a mechanical piece of work and will be resolved as soon as we integrate with the hospital's backend. The charts on the default screen will show the data most suitable for each diagnosis. This will be covered in Part III, when we are ready with the data.

Second, a quick introduction to the redesign of the initial concept. Vital signs had to go together, because they deliver synergistically more information when seen relative to each other. Vital signs are required for every diagnosis, hence we designed a special kind of chart for them and placed it at the top of the iPad screen. Medications turned out to be extremely important, so the physician can instantly see which meds are in use right now, the reaction of vital signs, diagnoses and allergies, and significant events. All other charts are specific to the diagnosis, and the physician should be able to drag and drop them as she needs. It is obvious that diabetes is treated differently than Alzheimer's. Only one chart has a dedicated place there: EKG. Partially, EKG is connected to the vital signs, but historically (and technically too) the EKG chart is completely different and should be rendered separately. Below is a snapshot of the new default screen:

Default Screen (with Notes)

The most important notes are filtered as Significant Events and can be viewed exclusively. Actually, the default screen can start with Significant Events; we just don't have much data for today's demo. Below is a screenshot with Significant Events for the same patient.

Default Screen (with Significant Events)

Charts are configurable like apps on the iPad: you tap and hold one, move it to the desired place, and release it. All other charts are reordered automatically around it. This is very useful, as it lets the physician work as she prefers. It is also a good opportunity to configure chart sets according to diagnosis. Actually, we embedded presets, because it is obvious that hypertension is treated differently than a cut wound. The screenshot below shows some basic charts, but we are still working on their usability. More about that in Part III.

Charts Configuration
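The reordering behavior described above (drop one chart in place, let the rest auto-arrange around it) boils down to a simple list operation. A minimal sketch, with purely illustrative chart names:

```python
def move_chart(charts, chart, target_index):
    """Drag-and-drop reorder: place `chart` at `target_index`;
    the remaining charts keep their relative order (auto-arrange)."""
    rest = [c for c in charts if c != chart]
    return rest[:target_index] + [chart] + rest[target_index:]

layout = ["Vitals", "Meds", "Labs", "EKG", "Notes"]
print(move_chart(layout, "EKG", 1))  # ['Vitals', 'EKG', 'Meds', 'Labs', 'Notes']
```

A diagnosis preset is then just a saved ordering of chart identifiers that can be applied with the same mechanism.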

According to the Inverted Pyramid, the default screen is the cap of the information mountain. While many are hyping Big Data, we move forward with All Information. Data is low-level atoms; users need information derived from the data. Our mEMR default screen delivers much information, and it can deliver all of it. It is up to the MD to configure the charts that are most informative in her context, and she can dig for additional information on demand. Labs are available on a separate view, grouped into panels. Images (x-rays) are available on a separate view too. The MD can tap the IMAGERY tab to switch to a view with image thumbnails, which correspond to MRIs, radiology/x-rays and other types of medical imagery. Tapping any thumbnail zooms the image to the entire iPad screen estate; the image becomes zoomable and draggable. We use our BigImage(tm) IP to empower image delivery of any size to any front end. The interaction with the image follows the Apple HIG standard.

Imagery (empowered by BigImage)

I don't put a snapshot of the scan here, because it looks like a standard full-screen picture. An additional description and demo of the BigImage(tm) technology is available at the SoftServe site http://bigimage.softserveinc.com. If new labs or new PACS images are available, they are pushed to the home screen as red notifications on the tab label (like on the MEASUREMENTS tab above), so that the physician can notice and tap to see them. This is a common scenario when a complicated lab is required, e.g. tissue research for cancer.

Labs are shown in tabular form; this was confirmed by user testing. We grouped the labs by their corresponding panels (logical sets of measurements). It is possible to order labs by date in ascending (chronological) and descending (most recent result first) order. The snapshot below shows labs in chronological order. The physician can swipe the table to the left (and then right) to see older results.

Labs
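The grouping and ordering just described can be sketched in a few lines. This is an illustrative assumption about the data shape (a flat list of result records), not the actual hospital backend format:

```python
from collections import defaultdict

def group_labs(results, most_recent_first=False):
    """Group lab results by panel, then sort each panel by date
    (chronological by default, most recent first on demand)."""
    panels = defaultdict(list)
    for r in results:
        panels[r["panel"]].append(r)
    for panel in panels.values():
        panel.sort(key=lambda r: r["date"], reverse=most_recent_first)
    return dict(panels)

results = [
    {"panel": "CBC", "test": "WBC", "date": "2012-09-01", "value": 6.1},
    {"panel": "CBC", "test": "WBC", "date": "2012-08-15", "value": 5.8},
    {"panel": "Lipid", "test": "LDL", "date": "2012-08-20", "value": 130},
]
labs = group_labs(results)                             # chronological order
recent = group_labs(results, most_recent_first=True)   # most recent first
```

Swiping left or right in the table then simply pages through the sorted columns of each panel.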

Editing is possible via a long tap on a widget, which puts the widget into edit mode; a quick single tap returns it to preview mode. The MD can edit medications (change existing, delete existing, and assign new ones) and enter significant signs and notes. Auditing is automatic: in accordance with HIPAA, the time and identity are captured and stored together with the edited data.

Continued in Mobile EMR, Part III.


Usability of Google Currents

This post is about digital publishing and its usability. I used Flipboard for some time and it is good. Not excellent, because of some mess at the lower nesting levels, but it is a baseline for other similar apps and services. Let's look at Google Currents; I will assess it on a Google Nexus S smartphone. So we have a native Google device and a native Google app. What result can we expect? Of course we want excellent, best of breed! What do we have in reality? Read on.

Level 0: Home Screen

We see a sexy image and some icons on the home screen. Navigation is horizontal, i.e. to move between pages we flip left or right. A standard thing on touch screens: one big surface, moved with the finger. From an aesthetics point of view, the Flipboard home screen on iPad is way better. OK, we are at Level 0, the Home Screen, and it has horizontal navigation. Take a look below:

Level 0, Home screen with horizontal navigation

Level 1: Fast Company and TechCrunch

I like to read Fast Company's stuff, so I tap its icon. By doing that I descend to the next nesting level; let's name it Level 1. What do we see at that level? Fast Company expands its content horizontally. We can navigate it as we did on Level 0: intuitively, everything the same, and that is good. Look at the snapshots below, demonstrating horizontal navigation for Fast Company:

Level 1, Fast Company with horizontal navigation

So far everything is cool and we love Google Currents as an alternative to Flipboard :) But it is time to launch TechCrunch now. I love TechCrunch too; it is on my Home Screen along with Fast Company. So, tapping the TechCrunch icon! I start to read, finish the first screen, try to flip right, and here comes an issue :( TechCrunch does not scroll horizontally. Surprise? I would call it a flaw in the interaction design of Google Currents. At the same level as Fast Company, Level 1, TechCrunch scrolls vertically. How could I know that? I was used to flipping horizontally on the Home Screen and in Fast Company. Look below at the vertical navigation of TechCrunch at Level 1:

Level 1, TechCrunch with vertical navigation

There is a strong inconsistency in content navigation at Level 1. It is difficult to pay special attention to the small widgets that serve as hints for how to scroll. A much better way is to implement the same navigation direction everywhere and show no widgets at all. I could finish this post at this point, but let's go further if you like.

Level 2: Fast Company and TechCrunch

Let's dive deeper into the content from those providers and write down what we feel. At Level 2, Fast Company resembles TechCrunch: it has vertical navigation. Below is a snapshot:

Level 2, Fast Company with vertical navigation

What do we have with TechCrunch at Level 2? Of course something different: we get horizontal navigation. Look below:

Level 2, TechCrunch with horizontal navigation

It seems Fast Company uses more levels to structure its content. There is a whole Level 3…

Level 3: Fast Company alone

There is content at this level, and there are multiple entry points to it. You can get text content or rich video content. Navigation is horizontal for both. Look below for the text content:

Level 3, Fast Company with horizontal navigation

Here is the video content:

Level 3, Fast Company with horizontal navigation

Conclusion

Google Currents on the Google Nexus S has severe usability flaws. There should have been rules for content providers to stick to. As a user I do not want to care about the structure; I want content instantly, as I had it with Flipboard on iPad. With Currents things are not so obvious, which inspired me to draw a joke. IMHO the quality of Google products is degrading. They are repeating Microsoft's path, with issues, bugs and … market penetration. It's sad.
