
Wearable Technology. Part II

This story is a logical continuation of the previously published Wearable Technology.

Calories and Workouts

Here I will show how two different wearable gadgets complement each other for Quantified Self. To begin, we need two devices: one is worn by yourself, the second is "worn" by your bike.

The first device is BodyMedia, the world's most precise calorie meter. It captures 5,000 data snapshots per minute from galvanic skin response, heat flux, skin temperature, and a 3-axis accelerometer. You can read more about BodyMedia's sensors online. BodyMedia uses extensive machine learning to classify your activity as cycling, then measures calories burned according to the cycling Big Data set used during training. Check out the paper Machine Learning and Sensor Fusion for Estimating Continuous Energy Expenditure for an excellent description of how the AI works.
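To make the two-stage idea concrete, here is a minimal sketch of sensor fusion for energy expenditure: classify the activity from per-minute features, then apply an activity-specific calorie model. This is purely illustrative; BodyMedia's real models are proprietary, and every feature name, threshold, and coefficient below is invented.

```python
# Illustrative sketch only: BodyMedia's actual algorithms are proprietary.
# Assumes one aggregated feature vector per minute from the four sensors.
from dataclasses import dataclass

@dataclass
class MinuteFeatures:
    gsr: float        # galvanic skin response, microsiemens
    heat_flux: float  # W/m^2
    skin_temp: float  # degrees Celsius
    accel_mag: float  # mean 3-axis acceleration magnitude, in g

def classify_activity(f: MinuteFeatures) -> str:
    """Toy stand-in for the trained classifier described in the paper."""
    if f.accel_mag > 1.4 and f.heat_flux > 120:
        return "cycling"
    if f.accel_mag > 1.1:
        return "walking"
    return "rest"

# Per-activity calorie models with made-up coefficients.
CAL_MODELS = {
    "cycling": lambda f: 4.0 + 0.05 * f.heat_flux,
    "walking": lambda f: 2.5 + 0.03 * f.heat_flux,
    "rest":    lambda f: 1.2,
}

def calories_per_minute(f: MinuteFeatures) -> float:
    return CAL_MODELS[classify_activity(f)](f)
```

The point of the two stages is exactly what makes BodyMedia better at calories than a GPS tracker: the model applied to the raw sensor stream depends on what activity the classifier thinks you are doing.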

The second device is the Garmin Edge 500, a simple and convenient bike computer. It has GPS, a barometric altimeter, a thermometer, motion detection, and more features for workouts. You can read more about the Garmin Edge 500 spec online. My gadgets are pictured below.

The gadgets: BodyMedia and Garmin Edge 500

On the Route

The route was proposed by Mykola Hlibovych, a distinguished bike addict. So I put my gadgets on and measured it all. Below is info about the route. Summary info such as distance, time, speed, pace, temperature, and elevation is provided by Garmin. It tries to guess the calories too, but it is really poor at that. You should know there is no "silver bullet" and understand what to use for what. Garmin is one of the best GPS trackers, so don't try to measure calories with it.

The juxtaposition of elevation vs. speed and temperature vs. elevation is interesting for comparison. Both charts are plotted against distance (rather than time). The 2D route on the map is a pretty standard thing; Garmin uses Bing Maps.

Map, elevation, speed, and temperature by distance

Burning Calories

Let's look at BodyMedia: redraw the Garmin charts of speed, elevation, and temperature along time (instead of distance) and stack them together for comparison and analysis. All the charts are aligned along the horizontal time line. The upper chart is the real-time calorie burn, also measured in METs. The vertical axis reflects Calories per Minute. Several times I burned at a rate of 11 cal/min, which was really hot. The big downtime between 1PM and 2:30PM was lunch.

An interesting fact is observable on the Temperature chart – the Garmin itself was warm and was cooling down to the ambient temperature. After that it started to record the temperature correctly. Another moment is a small spike in speed during the downtime window: it was Zhenia Novytskyy trying my bike to compare with his.

Calories, elevation, speed, and temperature over time
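If you want to reproduce a stacked, time-aligned view like the one above, here is a minimal sketch, assuming the BodyMedia and Garmin exports have already been parsed into per-minute lists sharing one time axis:

```python
# Minimal sketch: stack calories, speed, elevation, and temperature
# on a shared horizontal time line, as in the chart above.
import matplotlib.pyplot as plt

def plot_stacked(time, calories, speed, elevation, temperature):
    fig, axes = plt.subplots(4, 1, sharex=True, figsize=(8, 10))
    series = (calories, speed, elevation, temperature)
    labels = ("Calories/min", "Speed", "Elevation", "Temperature")
    for ax, values, label in zip(axes, series, labels):
        ax.plot(time, values)
        ax.set_ylabel(label)
    axes[-1].set_xlabel("Time")
    plt.show()
```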

Thorough Analysis

For a detailed analysis of performance on the route there is an animated playback, published on Garmin Cloud. You just need Flash Player. Click this link if WordPress does not render the embedded route player from Garmin Cloud. The player is embedded via an iframe below. You may see some ads from them, I think (because the service is free)…

The Mud

Wearable technology works in different conditions :)


Mobile Home Screens


Better Home Screen?

Definitely without iPhone’ish glossy icons,
wasting potentially useful space around.
Like it or not, glossy must stay in the past.
Near future is flat.

With aligned multi-sized widgets, like Windows Phone.
With variety of widgets like Android.
But much more aesthetic than they are today!

With more information embedded into the icon-sized area,
like Facebook Home squeezing three faces into a small context area.
Ideally the home screen can deliver useful information in five or six different contexts, even without any clicks.

Smaller widgets will remain and prevail,
because they are sized to our finger…


Mobile EMR, Part II

On the 27th of August I published Mobile EMR, Part I. This post is its continuation.

The main output from the initial implementation was feedback from users: they just needed even more information. We initially considered mEMR and All Information vs. Big Data. But it turned out that some important information was missing from the concept based on the Powsner/Tufte research. Hence we have added more information and are now ready to show the results of our research.

First of all, the data is still slightly out of sync, so please be tolerant. It is a mechanical piece of work and will be resolved as soon as we integrate with the hospital's backend. The charts on the default screen will show the data most suitable for each diagnosis. This will be covered in Part III, when we are ready with the data.

Second, a quick introduction to the redesign of the initial concept. Vital Signs had to go together, because they deliver synergistically more information when seen relative to each other. Vital Signs are required for all diagnoses. Hence we designed a special kind of chart for vital signs and placed it at the top of the iPad screen. Medications happened to be extremely important, so that the physician instantly sees which meds are in use right now, the reaction of vital signs, diagnosis and allergy, and significant events. All other charts are specific to the diagnosis, and the physician should be able to drag'n'drop them as she needs. It is obvious that diabetes is treated differently than Alzheimer's. Only one chart has a dedicated place there: EKG. EKG is partially connected to the vital signs, but historically (and technically too) the EKG chart is completely different and should be rendered separately. Below is a snapshot of the new default screen:

Default Screen (with Notes)

The most important notes are filtered as Significant Events and can be viewed exclusively. Actually, the default screen can start with Significant Events; we just don't have much data for today's demo. Below is a screenshot with Significant Events for the same patient.

Default Screen (with Significant Events)

Charts are configurable like apps on the iPad: you tap and hold one, move it to the desired place, and release it. All other charts are reordered automatically around it. This is very useful for the physician, who can arrange the workspace as she prefers. It is also a good opportunity to configure chart sets according to the diagnosis. Actually, we embedded presets, because it is obvious that hypertension is treated differently than a cut wound. The screenshot below shows some basic charts, but we are working on their usability. More about that in Part III some time.

Charts Configuration
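As a rough illustration of the reflow behaviour (a sketch, not our actual iOS code), the layout can be modeled as an ordered list mapped onto a grid: dropping a chart at a new index shifts everything else around it. The pinned Vital Signs and EKG slots are ignored here for brevity.

```python
# Sketch of drag'n'drop reordering: drop a chart at a target index,
# all other charts flow around it automatically.
def move_chart(charts: list[str], chart_id: str, target_index: int) -> list[str]:
    rest = [c for c in charts if c != chart_id]
    return rest[:target_index] + [chart_id] + rest[target_index:]

layout = ["vitals", "ekg", "glucose", "weight"]
layout = move_chart(layout, "glucose", 0)
# -> ["glucose", "vitals", "ekg", "weight"]
```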

According to the Inverted Pyramid, the default screen is the cap of the information mountain. While many are hyping Big Data, we move forward with All Information. Data is low-level atoms; users need information from the data. Our mEMR default screen delivers much information. It can deliver all information. It is up to the MD to configure the charts that are most informative in her context, and the MD can dig for additional information on demand. Labs are available on a separate view, grouped into panels. Images (x-rays) are available on a separate view too. The MD can tap the IMAGERY tab and switch to a view with image thumbnails, which correspond to MRIs, radiology/x-ray, and other types of medical imagery. Tapping any thumbnail zooms the image to the entire iPad screen estate. The image becomes zoomable and draggable. We use our BigImage(tm) IP to power image delivery of any size to any front end. Interaction with the image follows the Apple HIG standard.

Imagery (empowered by BigImage)
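BigImage internals are beyond the scope of this post, but a tile pyramid, as used by deep-zoom viewers in general, is the standard way to deliver an image of arbitrary size to a zoomable, draggable view. Here is a sketch under that assumption, with an invented 256-pixel tile size:

```python
# Sketch of tile-pyramid math for delivering arbitrarily large images.
import math

def pyramid_levels(width: int, height: int, tile: int = 256) -> int:
    """Zoom levels needed so the most zoomed-out level fits in one tile."""
    levels = math.ceil(math.log2(max(width, height) / tile)) + 1
    return max(levels, 1)

def tiles_at_level(width: int, height: int, level: int,
                   max_level: int, tile: int = 256) -> tuple[int, int]:
    """Tile-grid size at a given level (level 0 = most zoomed out)."""
    scale = 2 ** (max_level - 1 - level)
    w, h = math.ceil(width / scale), math.ceil(height / scale)
    return math.ceil(w / tile), math.ceil(h / tile)
```

The client then fetches only the tiles intersecting the visible viewport at the current zoom level, which is what keeps pan and zoom smooth on an iPad regardless of the source image size.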

I don't include a snapshot of the scan here, because it looks like a standard full-screen picture. An additional description and demo of the BigImage(tm) technology is available at the SoftServe site http://bigimage.softserveinc.com. If new labs or new PACS images are available, they are pushed to the home screen as red notifications on the tab label (like on the MEASUREMENTS tab above) so that the physician can notice them and tap to see them. It is a common scenario when some complicated lab is required, e.g. tissue analysis for cancer.

Labs are shown in tabular form; this was confirmed by user testing. We have grouped the labs by their corresponding panels (logical sets of measurements). It is possible to order labs by date in ascending (chronological) or descending (most recent result first) order. The snapshot below shows labs in chronological order. The physician can swipe the table to the left (and then right) to see older results.

Labs
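A quick sketch of how such a table can be assembled, assuming lab results arrive as flat (panel, test, date, value) records; the data below is invented:

```python
# Sketch: group labs by panel, one column per date, chronological order.
import pandas as pd

records = [  # illustrative records only
    {"panel": "Coagulation", "test": "PT",  "date": "2012-08-20", "value": 12.1},
    {"panel": "Coagulation", "test": "INR", "date": "2012-08-20", "value": 1.0},
    {"panel": "CBC",         "test": "HGB", "date": "2012-08-20", "value": 140},
    {"panel": "CBC",         "test": "HGB", "date": "2012-08-24", "value": 138},
]

df = pd.DataFrame(records)
table = df.pivot_table(index=["panel", "test"], columns="date", values="value")
# ascending=True gives chronological order; flip it for most-recent-first.
table = table.sort_index(axis=1, ascending=True)
print(table)
```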

Editing is possible via a long tap on a widget, which puts the widget into edit mode. A quick single tap returns the widget to preview mode. The MD can edit medications (modify existing, delete existing, assign new) and enter significant events and notes. Auditing is automatic: per HIPAA, time and identity are captured and stored together with the edited data.
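A minimal sketch of what that automatic audit capture could look like; the field names and structure are my assumptions for illustration, not the product's actual schema:

```python
# Sketch of an automatic, append-only audit trail for widget edits.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditRecord:
    user_id: str    # identity of the editing physician
    action: str     # "edit" | "delete" | "assign"
    widget: str     # e.g. "medications"
    payload: dict   # the edited data itself
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

AUDIT_LOG: list[AuditRecord] = []  # append-only store in a real system

def record_edit(user_id: str, action: str, widget: str, payload: dict) -> AuditRecord:
    entry = AuditRecord(user_id, action, widget, payload)
    AUDIT_LOG.append(entry)
    return entry
```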

Continued in Mobile EMR, Part III.


Mobile EMR, Part I

At SoftServe we do research, and I'm in charge of it. This time I'd like to describe one of our research projects, called Mobile EMR. EMR stands for Electronic Medical Record. This research is intended to find out what hospitals, agencies, and practitioners will use tomorrow. Mobile EMR, aka mEMR, sits at a hot intersection of the booming Mobility, Big Data, Touch Interface, and Cloud trends. Today we research it; tomorrow they will use it. SoftServe works with multiple ISVs that build solutions for hospitals and agencies, hence we have influence on the evolution of medical solutions.

Part I is about how we started and what we have achieved to date, with conclusions and next steps, which are described later in Part II.

Beginning

It started some time ago during UX research on medical data visualization. Then the famous government initiative was launched to transfer the healthcare industry to EMR. Mixing that soup, we figured out that it was possible to propose a really new concept of the EMR for physicians. We found the research by Seth M. Powsner and Edward R. Tufte, "Graphical Summary of Patient Status", published back in 1994. They worked at Yale University; one of them is an MD, the other is a PhD in statistics and a guru of presenting data. It was oriented toward the All Data paradigm that Mr. Tufte loves. I love it too. I love the idea of having All Data on a single pager. As soon as Apple released the iPad, we understood that it was the perfect one-pager to put the EMR onto. In 2011 I attended E. Tufte's one-day course and found a moment to speak about healthcare data visualization. Mr. Tufte confirmed there was still nothing done in the industry! He pointed to the printed copy of the mentioned research and proposed to implement it. After that we had one more face-to-face contact with Mr. Tufte on that research, and we got some additional clarifications and recommendations (mainly related to low-level details such as sparklines). Below is a snapshot of the visualization proposed by Powsner/Tufte:

Seth M. Powsner and Edward R. Tufte, “Graphical Summary of Patient Status”, The Lancet 344 (August 6, 1994), 386-389.

All Data has to be handled by a special kind of chart that shows three periods of data. The rightmost, biggest part shows a week or 10 days; the middle, narrow part shows the previous year; and the leftmost part shows all available data prior to the last year. With such a data presentation we are able to display all anomalies and trends for the whole period that has data logs. A small code sketch of this split follows the chart below.

'All Data' chart

“All Data” chart. Seth M. Powsner and Edward R. Tufte, “Graphical Summary of Patient Status”, The Lancet 344 (August 6, 1994), 386-389.
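A minimal sketch of the three-period split, assuming samples are timestamped (datetime, value) pairs and the rightmost panel covers the last 10 days:

```python
# Sketch: partition a measurement log into the three 'All Data' panels.
from datetime import datetime, timedelta

def split_all_data(samples, now=None, recent_days=10):
    now = now or datetime.now()
    recent_cut = now - timedelta(days=recent_days)
    year_cut = now - timedelta(days=365)
    recent  = [(t, v) for t, v in samples if t >= recent_cut]
    year    = [(t, v) for t, v in samples if year_cut <= t < recent_cut]
    earlier = [(t, v) for t, v in samples if t < year_cut]
    return earlier, year, recent  # leftmost, middle, rightmost panels
```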


We were aware of a web implementation. It was a 100 percent copy of the research (the charts part of it). Below is a screenshot from the browser:

Web EMR by KavaChart

First Version

We took our Mobile IP and SDK (reusable blocks such as authentication, cache, cryptography, etc.; BigImage(tm); SaaS SDK) and built the first app for the iPad. Obviously we lacked deep domain knowledge, hence the first version is not perfect. But the idea was to prove technology feasibility rather than deliver a ready-made solution (because we work with healthcare ISVs who keep the deep domain expertise). There were a few cycles for visualization and layout of the charts and other UI elements. As a result, we got this "first version":

mEMR Default Screen

All remarks and proposed improvements will be listed a bit later! Right now I'd like to show a few more screenshots of what we have got. Physicians identify patients by MRN, or by name if the patient has stayed at the hospital for some time. Hence, we introduced two-way patient identification: via the My Patients list, and via wristband/MRN scan. Below are the screenshots: My Patients and Wristband/MRN Scan. The first one is My Patients:

mEMR My Patients

This one is Wristband/MRN Scan:

mEMR Wristband Scan

User Testing and Recommendations

The "first version" was shown to an MD from a large New York hospital. Impressions were mixed :-O
In general, the idea of such an mEMR is good, but using it with its current data is not so valuable. I got recommendations for data presentation improvements. They are listed below in no particular order. The format is the following: a subheader represents an issue or a proposal for improvement; the text after it describes the reasoning and clarifies details.

User Info

In the top left corner, next to the photo, we show basic patient information. It happened that both DOB and age are required simultaneously; the year serves as additional identification for multiple John Smiths. The information about age should look like "56 y.o. F", where F stands for female and M stands for male. Sex must follow the age and must be encoded as a single letter, F or M. It should be next to the patient name. DOB should be on the second line. Look below:

Photo    John Smith  62 y.o. M

    DOB 05/24/1950

In the top right corner we showed Discharged. It happened that nobody cares about discharge, while everybody needs Admitted. Hence, right below the MRN we have to add a new line; the MRN and Admitted records must be kept together. Look below:

MRN: 221881-5

Admitted: 08/24/2012

These requirements relate to the integrity of the data: what is kept with what. There are no requirements for the UI layout; the current layout was confirmed as good. The sketch below illustrates the agreed format.
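For illustration only, a hypothetical formatter that produces both header corners in the agreed format:

```python
# Hypothetical header formatting; produces plain strings for both corners.
from datetime import date

def header_left(name: str, dob: date, sex: str, today: date) -> list[str]:
    # Age in full years; sex is a single letter, F or M, after the age.
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return [f"{name}  {age} y.o. {sex}", f"DOB {dob:%m/%d/%Y}"]

def header_right(mrn: str, admitted: date) -> list[str]:
    # MRN and Admitted are kept together, one under the other.
    return [f"MRN: {mrn}", f"Admitted: {admitted:%m/%d/%Y}"]

print(header_left("John Smith", date(1950, 5, 24), "M", date(2012, 8, 24)))
# -> ['John Smith  62 y.o. M', 'DOB 05/24/1950']
```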

Diagnosis

There are widely used abbreviations for diagnoses. It is OK to write something like HTN and MDs will understand it. Writing Depression in full is also fine, but names are often long and an abbreviation works better. An important piece of information that is missing in our "first version" is Allergy. It is a must that complements the diagnosis. We have to allocate some space below Diagnosis and list the Allergy there. If an allergy is present, it must be highly contrasting, maybe highlighted in red. Look below:

HTN, Depression

Allergy: Aspirin (rash)

Medication

The patient can currently be on some medications. They must be listed at the top of our single pager. Current Medication is as important as Diagnosis. We can show it as a simple list of what the patient takes. Look below:

Metoprolol 50 daily [sorry, I can’t put units here right now]

Norvasc 5 mg daily

Paroxetine 20 mg daily

Medication can be labeled as Meds. It should be in the upper area of the screen, together with the improved User Info and Diagnosis. Other info such as hospitalizations and clinic visits should also be kept within the upper area of the screen. The later sections describe the middle area of the screen.

I will provide typical medications for these 4 diagnoses: HTN, CAD/HLD, CHF, Asthma. Please restrict the patients' data to these 4 diagnoses only; this is our restriction for demo purposes. Medications will take at most 4 lines of text, each line up to 25-30 characters.

Vital Signs

We showed them as separate charts: a chart per temperature, a chart per pressure, a chart per heart rate. It sucks. The reason is that it does not represent information. What do MDs see as information in vital signs? The relation between all the signs together! Hence we must pack all 5 vital signs into a single chart so that all peaks and dropdowns are visible relative to each other. The labeling for vital signs is the following: t for temperature, HR for heart rate, BP for blood pressure, RR for respiratory rate, SpO2 for oxygen saturation. The order of vital signs should be as listed, with t on top and SpO2 at the bottom. The chart should be built from sparklines, not from dots. All minimums and maximums should be visualized properly. Now I understand what Mr. Tufte explained to our UX designer about the use of sparklines: we thought it was for all charts, but it turned out to be only for vital signs. Look there for the visualization spec. The chart with vital signs must be on top; let's keep it always in the top left corner. If there is insufficient space, stretch it to the width of two charts.
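Here is a rough sketch of such a combined panel: five sparklines sharing one time axis, with the minimum and maximum of each series marked. The plotting details are invented, not our actual implementation:

```python
# Sketch: five vital-sign sparklines stacked on a shared time axis,
# ordered t, HR, BP, RR, SpO2, with min/max of each series marked.
import matplotlib.pyplot as plt

VITALS_ORDER = ["t", "HR", "BP", "RR", "SpO2"]

def vitals_panel(time: list, series_by_name: dict):
    fig, axes = plt.subplots(len(VITALS_ORDER), 1, sharex=True, figsize=(6, 4))
    for ax, name in zip(axes, VITALS_ORDER):
        values = series_by_name[name]
        ax.plot(time, values, linewidth=0.8)      # sparkline, not dots
        lo, hi = min(values), max(values)
        ax.plot(time[values.index(lo)], lo, "v")  # mark the minimum
        ax.plot(time[values.index(hi)], hi, "^")  # mark the maximum
        ax.set_ylabel(name, rotation=0, labelpad=20)
        ax.set_yticks([])                         # keep it sparkline-bare
    plt.show()
```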

Notes

They suck. Usually doctors put a lot of secondary info there, so there is no place for such irrelevant notes on the Home Screen. All doctors need is an entry point into the Notes; they read them on demand, on a secondary screen opened intentionally. Implement it if possible.

Significant Events

Those are important! A Significant Event is an event related to the life of the patient, e.g. the patient fell and got a head injury. In other words, events that are non-typical for the diagnosis. If vomiting is common for the diagnosis, then vomiting is not a significant event, hence not worth big attention.
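The rule fits in a few lines; the diagnosis-to-expected-events map below is a toy illustration, not clinical logic:

```python
# Sketch: an event is significant if it is not expected for the diagnosis.
EXPECTED = {
    "HTN": {"headache", "dizziness"},
    "Gastroenteritis": {"vomiting", "nausea"},
}

def significant_events(events: list[str], diagnosis: str) -> list[str]:
    expected = EXPECTED.get(diagnosis, set())
    return [e for e in events if e not in expected]

print(significant_events(["vomiting", "fall with head injury"], "Gastroenteritis"))
# -> ['fall with head injury']
```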

Probably we can redesign the right pane to show Medication on top, then Significant Events in the middle, and keep Notes as an entry point at the bottom. If Medication takes too much space, we can group it with Vital Signs (a horizontally enlarged chart) and keep them together on top as a separate section, with all other charts and Significant Events below them. Personally I like the first alternative better – to use the current right pane for both Medications and Significant Events.

EKG

Electrocardiology is a must on the home screen. Add it as a separate chart, as in Tufte's research and in the web demo (see the links above). It is OK to keep the EKG in the same place every time, hence keep it near Vital Signs, for example at the same horizontal level.

Imaging (aka Radiology)

This is a section with all scans. MDs call it Imaging and Radiology interchangeably. They treat it as secondary information. Hence we should put perhaps 1 or 2 images on the Home Screen, but implement a separate screen with thumbnails of all imagery available for that patient. For our demo we can keep a few images on the Home Page, then we will see. But the images should be organized in a group labeled Imaging.

Labs

Our "first version" doesn't have Labs data at all. What do Labs look like? It is a list of text and numbers. Furthermore, MDs are used to a fixed order for reading labs (BTW, I can confirm the same for vital signs). E.g. the panel called Hepatic Panel is shown as the sequence AST, ALT, Alkaline Phosph. The Coagulation panel is shown as PT, PTT, INR. It is common for a Labs list to contain 5+ sets of results (called Panels here). Superscripts are used to represent K+, Na+, Cl-, CO3-, Ca2+, and so on.

We can allocate space at the bottom of the Home Screen to visualize Labs and Imaging. Labs should get more space than Imaging. As I've said, having 2 thumbnails as an entry point to BigImage is sufficient; the physician then taps either the Imaging title or the "…" next to it to get to the separate screen with all other images. Labs could be shown in tabular format. Look below:

           date1   date2   date3
Erythr.    4.5     4.3     2.8
HGB        140     138     100

Disease Profiles

It could be useful to have layout presets for chronic diseases like Diabetes etc.

Continued in Mobile EMR, Part II.


Web 3.0

What is the future of the Web?

Is it the Semantic Web, as somebody called it long ago? Spend a few minutes reading the diverse definitions of Web 3.0 on the wiki and come back here. Nobody argues with those predictions; all of them will happen at some point in the future. My favorite prediction is something like what Kevin Kelly made public at the end of 2007, called "Next 5000 Days of the Web".

All those devices and sensors that will suck data into the web are related to our mobile devices. From Mobile World Congress 2012 I brought a figure, announced by Eric Schmidt: soon we will have 50,000,000,000 connected devices. Just imagine that number, almost ten devices per person. It is really huge!

But what do we have today?

Today we see the boom of mobile apps. It is similar to the boom of apps for PCs 20-25 years ago. History repeats itself, just at a different level: now we have an app boom for devices smaller than the PC. Years ago we had a premium vendor of the app platform, Apple, and a commoditizer, Microsoft. Today we have the same: the premium vendor is Apple, and the new commoditizer is Google/Android. The big picture is similar: apps are booming, there is a brand new community of developers and users, and new business models are emerging for monetizing this new boom.

How is this related to the Web at all? The web is in place, it is inevitable, and we are all in the web, but there are nuances ;) Surfing the web with the Mobile Web is not the same as using a Native App. For business applications the Mobile Web is the logical choice; it smoothly substitutes for awkward MEAP solutions. It is no surprise that Gartner did not identify any MEAP vendors as Leaders in its Magic Quadrant: there are niche players and visionaries, but no leaders. It was not easy, and many walls were broken by the Mobile Web. Enterprises love the Mobile Web; it has emerged and is gaining popularity. Is it Web 3.0? What is the difference between a web app for desktop, tablet, and phone? Almost none, just a few additional features like geolocation available from the browser, the camera, and so on. But the delivery model is the same SaaS-like model familiar from PC times. Hence it is not a revolution deserving the name Web 3.0.

Revolution happened.

The revolution seems to be this application boom on modern phones and tablets. It smells like a revolution. This is observable in apps like Instagram. Believe it or not, Instagram was a threat to Facebook! Initially people published photos on Flickr or Picasa and sent links to friends and colleagues to share them. With Facebook's photo sharing feature this got simpler: you just upload photos and they are shared automatically within your network. No need for Flickr or Picasa anymore? Then came Instagram, with the ability to take pictures with the phone, apply some cool effect, and share instantly, without connecting the device to a PC and without that annoying bulk upload. Instagram has a backend synthesized from Facebook and Twitter, which is cool for the user. You don't need Facebook anymore to share your pictures! Bingo!

OK, Instagram is cool; Facebook even bought it to kill it as a competitor… But where is the web in all this? It is called Web Services. There is a very rich and powerful web, full of clouds and web services. As Jeff Bezos once said, the future of the web was in Amazon Web Services. It is. We have got the very popular S+S model, with a native app on the phone/tablet and a back end on AWS or the like. There is a good report by Vision Mobile, "Apps is a New Web", dated 2010. We have got new ways of discovering useful things, a brand new UX, and new monetization models. Enough arguments to call it a New Web. Maybe not Web 3.0, but definitely no longer Web 2.0.

To HTML5 Believers.

Those who bet on HTML5 as a standard and a return to the good old SaaS approach can be pleased: for enterprises this works even today and will work tomorrow. But for non-enterprise users it is not the case. First of all, standards need a few years (up to 5) to mature; only then does wide adoption happen. Second, hardware will evolve too, and web technologies will not keep the pace of hardware evolution. Have you ever heard about the new sensors planned for the new iPhone? E.g. the infrared camera patent filed by Apple recently. It will serve DRM, like preventing the recording of a live show. It will serve to identify objects by infrared tags, instead of ugly QR tags. Infrared tags are invisible to people, which makes them better, because they do not spoil the look of the object. OK, back to the infrared sensor: do you think web tools like HTML will support the infrared camera tomorrow? I think not. I even bet they will not. The pace of hardware is fast, and web technologies will be a few steps behind.


We have entered Web 3.0

New sensors like the infrared camera will be added to phones and tablets in the future. Other devices will emerge as well. Recall the 50,000,000,000 connected devices: there is no easy way to apply SaaS to all of them. There is a strong M2M trend, observed over recent years. It is not Web 2.0 anymore. We started from user apps; now we are descending to machine apps too… It is really something brand new. I propose to call this new era Web 3.0. For the semantic web we can choose another name when it comes. So far we are within something new, and instead of calling it the New Web, let's call it Web 3.0.


WHERE 2.0, Wednesday, Part II – Healthcare & Location

Healthcare session

INTRO highlights from four panelists.

Geomedicine is emerging. Location data is needed "to make medical records more enriched". Environment is a critical player, and environment is significantly related to geography. A new medical framework is needed.

Place information is to be brought into the story. Bring mobility into the story, as location traces. Location over time (physical activity) matters. How do I communicate chronic disease? Traces of daily lives. Pulling features out of daily traces. Looking for standardized platforms to use. Looking for mobilized innovations in healthcare.

Intersect mobile activity data to deconstruct the "genome" of human behavior: why people do what they do, and how. Get data from telecoms, Twitter, Facebook and analyze it. Track diseases through social and location networks. Incorporate the findings into doctors' diagnostic tools.

Tools for healthy behavior changes. How tools change daily behavior patterns. Places, environment, time, etc. matter. How to utilize technology in healthcare. Location data seems very attractive for unlocking hidden patterns in disease and people data.

APPLICATIONS of new approaches to healthcare.

Eating. If somebody is prohibited from eating fast food, the app will discover that they go to Taco Bell and warn them not to eat there. Maybe the app will tell the doctor that the patient is violating the rules of nutrition. Therapy tools: tools keep track and make recommendations. A tool knows you spent 5 hours in a shopping mall and how that affects insomnia. There are 20,000 toxic places in the States. 10M pounds of chemicals are produced per day… it is an issue for the environment. Public health: you are informed, you make decisions. Further: how do we bring other specific points of data into the framework?

BARRIERS in geomedicine

Privacy in healthcare is a huge barrier. Overwhelming amounts of information. Heh, Big Data comes to healthcare :) Sensemaking out of the data. The profession of data scientist comes to healthcare. An ecosystem is needed.

OPPORTUNITIES

Location is an opportunity. In the long term: understanding human activity at the level of context, to encourage doing something. Change people's awareness and attitude. Autonomous tracking of behavior patterns. Sensors are not really working yet. Where do you go? Whom do you contact? You can understand people and their health-related behavior: e.g. whether you eat in front of the TV, how often you eat at home, how fast you eat, the time of eating, gym facilities. Interventions are bound to the context: what exactly to push to the patient. Indoor location within hospitals and clinics. The geodistance of your day. These things impact medication; analysis of daily and weekly traces gives answers. Parkinson's disease is location specific. Self-diagnosis via summarized information. For medical tools, "one size does not fit all anymore". Different personalities; research has been done.

CONCLUSION

Conclusion for mHealth: Public Health, Disease Management, and Mobile EMR were confirmed as distinguished categories (out of ~10 other categories) of mHealth solutions.