Thursday, May 30, 2013

I2P: Rolling Pharmacies into the Digital Age

The Digital Age won't be more of the same, but faster, for Pharmacies.

There are three big Disruptions coming your way, necessary to address ballooning healthcare costs:
  • Dispensing 'tapes' and secure cartridges for in-home personalised dispensing.
    • In the 1970s, electronics manufacturers moved to automated "pick and place" of parts. Instead of selling loose parts, manufacturers supplied them on tapes, which were loaded into the automatic machines.
    • Large automated dispensaries are currently being sold to hospitals, and multiple suppliers are selling pre-packed & labelled medications to nursing homes. The problem with both approaches is that most patients aren't in these locations, especially the elderly, who are likely to miss doses or multi-dose, with potentially fatal consequences.
    • Home dispensing machines will need to read identity bracelets and have one or more biometric detectors (as simple as a microphone or video camera+face recognition) to confirm who is being dosed and to log dispensing.
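The gatekeeping step above (confirm who is being dosed, then log the dispensing) can be sketched in a few lines. This is a toy model with hypothetical names, not a real device's firmware:

```python
import hashlib
from datetime import datetime, timezone

def confirm_and_log(bracelet_id: str, biometric_id: str, drug: str, log: list) -> bool:
    """Dispense only when the identity bracelet and the biometric
    reading agree, and append a tamper-evident entry to the log."""
    if bracelet_id != biometric_id:
        return False  # identity mismatch: refuse to dispense
    entry = f"{datetime.now(timezone.utc).isoformat()}|{bracelet_id}|{drug}"
    prev = log[-1][1] if log else ""
    # chain each entry to the previous one's digest so edits are detectable
    digest = hashlib.sha256((prev + entry).encode()).hexdigest()
    log.append((entry, digest))
    return True
```

Chaining each log entry to the previous one means any later tampering with the dispensing record shows up when the chain is re-checked.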
The "magic sauce" for Pharmacists is becoming the Technology and Patient Monitoring gatekeeper, or "go-to guy" (or person, for those who prefer that language).

The reason Pharmacists can take this ground is simple and compelling:
Doctors are too busy, nurses see people and know if they are maintaining their health, but only Pharmacists know in detail the effects of drugs, non-drug treatments and adjuncts and the many subtle warning signals. Plus, people generally like their pharmacist and feel confident in talking to them.
Integrating these three Disruptive forces allows additional high-value services, like changing dosages in response to new readings.

Monday, April 1, 2013

I2P: Things I've been reading

This newsletter is for Busy People, so this month I don't offer opinions, but articles / recordings I found interesting that you can dip into as time and interest permit.

Topic 1.

What happens when a good academic/researcher finds himself in debate with Climate Skeptics?
He doesn't pretend he's an expert in Climate Science; he applies real science in his own discipline. A reminder of what Good Science looks like.

Meet Prof Lewandowsky from UWA and one of his papers "Recursive Fury".

A good quick intro; it includes an excellent 3-minute video by the Prof from ConversationEDU [direct links below]
http://www.quadrant.org.au/blogs/doomed-planet/2012/09/uwa-s-scourge-of-sceptics

"Climate change denial and the abuse of peer review" [02:57]
http://www.youtube.com/watch?v=s4GUMMx4sK8&list=UUQfzgoCt72pQNpkvnHjNvog&index=25

"A journey into the weird and wacky world of climate change denial" [03:09]
http://www.youtube.com/watch?v=C8wVfxoPqPA&list=UUQfzgoCt72pQNpkvnHjNvog&index=24

An intro from a science show late 2012 [long]
http://www.abc.net.au/radionational/programs/scienceshow/climate3a-who-denies3f/4381756#transcript

An overview of "Recursive Fury"...
http://www.skepticalscience.com/Recursive-Fury-Facts-misrepresentations.html
Conspiracy theorists are those who display the characteristics of conspiracy ideation
Recursive Fury establishes, from the peer-reviewed literature, the traits of conspiracist ideation, which is the technical term for a cognitive style commonly known as “conspiratorial thinking”. Our paper featured 6 criteria for conspiratorial thinking:
1. Nefarious Intent (by conspirators): they're out to get us.
2. Persecuted Victim: Self-identifying as the victim of an organised persecution.
3. Nihilistic Skepticism: Refusing to believe anything that doesn’t fit into the conspiracy theory.
4. Nothing occurs by Accident: Weaving any small random event into the conspiracy narrative.
5. Something Must be Wrong: Switching liberally between different, even contradictory conspiracy theories that have in common only the presumption that there is something wrong in the official account by the alleged conspirators.
6. Self-Sealing reasoning: Interpreting any evidence against the conspiracy as evidence for the conspiracy.


Topic 2.

This program on Radio National is a good listen: it revisits a 35-year-old paper that took a fresh look at how Medicine really happens.

THE SICK MAN IN MEDICAL COSMOLOGY [27min]
http://www.abc.net.au/radionational/programs/bodysphere/the-sick-man-in-medical-cosmology/4592966

In 1976 the British sociologist Nicholas Jewson published a paper called "The Disappearance of the Sick-Man from Medical Cosmology, 1770-1870."

Dr James Bradley
Lecturer in History of Medicine/Life Science at the University of Melbourne


Topic 3.

Letterman this week rebroadcast an interview (Jan, 2013) with Al Gore, notionally about his latest book, "The Future".
It ranged over many topics and shows Letterman can be a tough interviewer. Letterman thought the situation looked hopeless in parts, but they both agreed that part of the cause was the capturing of Politics by Big Money interests: politicians play to those who fund their campaigns, not those who elect them.

This came on the back of reading a "Rock Centre" piece on Dr Eric Topol about "Apps" revolutionising medical care, where Dr Topol sums up the root of the persistent, systemic problems:
Doctors have NO incentives to reduce costs.
The Letterman interview sparked this thought:
The World is in the grip of a long-term Healthcare crisis with spiralling costs and falling patient outcomes: where is the equivalent of Al Gore's "An Inconvenient Truth"?
There are not only increasing issues around Patient Safety, Quality Improvement, Process Improvement and Improved Cost Effectiveness, but serious questions about the effectiveness of Professional Societies, Medical Boards and the supporting legislation for criminal action.

Queensland Health is again in the news. Directly after the acquittal of "Dr Death" on manslaughter charges, a senior investigator is suggesting up to 100 doctors should be investigated for criminal negligence, yet just six are being re-investigated.

Directly following the wide-ranging Commission of Inquiry there, sparked by the deaths in Bundaberg, how can this be so? How can the Queensland AMA be hosing this down instead of mounting a strong protest outside Parliament?

Who is looking after Patient interests and holding individuals and organisations accountable for their actions? It seems to be only the occasional media story, nothing more.

This lack of Organisational Learning and Improvement fails my Rubric of Professionalism, yet seems acceptable to "those in Power".
It is NOT Professional to repeat, or allow, Known Errors, Faults and Failures.
We wouldn't accept this in Aviation, or even railways, so why does Healthcare get to slide by?

Thursday, February 28, 2013

I2P: Using the Cloud for Professionals.

Suppose you want to write a paper or do some research with a group, or simply have an interactive conversation about improving your practice or bettering your business. How would you do that?

Here's my story of what's possible and how to get there...

The stock-in-trade for professional programmers is not writing code, but dealing with people and the complexities of jointly constructing or maintaining large, complex and invisible artefacts.

As a software professional, I've worked on projects and systems in the 1-2M Lines of Code range, with 50-100 coders. These are often considered "large", especially in my discipline, real-time technical programming.

To produce anything requires process, tools and discipline, but mostly automation of the important tasks. It's beyond human capability for even three people to manage a single project by hand without significant errors and problems, or without being forced into rigid compartmentalisation and code isolation.

The largest, most complex and challenging codebase known is the Linux Kernel, now 21 years old, built by people distributed around the planet who mostly never meet:
  • 38,566 files of 15,384,000 lines
  • 2,833 developers from 373 companies
  • every day, 10,500 lines added; 8,400 lines removed; 2,300 lines modified
  • A staggering 5.79 changes per hour, 365 days a year.
This work relies on the Internet and automation. Things must Just Work.
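Those figures hang together; a quick arithmetic check of the quoted rate (treating "changes" as commits):

```python
# Cross-check the reported rate: 5.79 changes per hour,
# sustained around the clock, 365 days a year.
per_hour = 5.79
per_day = per_hour * 24      # roughly 139 changes every single day
per_year = per_day * 365     # roughly 50,700 changes a year
print(round(per_day), round(per_year))
```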

This is the largest, most active distributed collaborative project ever known, Software or not.

How do these folk do this, day after day, without serious problem?
What's the secret sauce, the magic toolbox, that allows them to do all this?

Handling anything so large, complex and fast-changing must surely need amazingly complex, difficult-to-master tools, wouldn't it?

No.
Just two, but only because they're based on the Internet and Open Source Software and another secret:
  • A distributed Version Control system, now "git", stores the changes to every file in a single "Code Repository". Other projects use "subversion" and "CVS". The Repositories are replicated and backed up using "Cloud computing" principles.
  • A common "toolchain". The compilers, linkers and analysis tools plus "make", the command that knows what-depends-on-what and how to compile any and all parts of the source code into an executable file or "binary".
The other secret is:
All exchanged files have very simple structures, usually plain text files.
That's it: the recipe we know scales to over 3,000 simultaneous users and 15M lines (and the same or more again in documentation) - when printed, 150,000 pages, or the proverbial "1000 ft high" pile.
Text source files, a shared Code Repository run over the Internet on Cloud services and a common set of build tools.
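The "what-depends-on-what" job that "make" does is simpler than it sounds. Here's a toy sketch of its core rule (rebuild a target only when it's missing or a dependency is newer), in Python rather than make's own language:

```python
import os

def needs_rebuild(target: str, deps: list) -> bool:
    """Toy version of make's rule: rebuild if the target is missing
    or any dependency has a newer modification time."""
    if not os.path.exists(target):
        return True
    target_time = os.path.getmtime(target)
    return any(os.path.getmtime(d) > target_time for d in deps)
```

Real make adds recursive dependencies and the recipes to run, but this timestamp comparison is the heart of it.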
Apart from learning the language, the compiler and each of the programs in the "toolchain" (and reading the code!), what's the overhead and training required to get into kernel development? Surprisingly little. I'm not sure why every 2nd-year computing or software engineering student doesn't have this as a course requirement to progress to 3rd year. Any 1st-year student could probably do it; every 2nd-year student should be as competent in these tools as in reading reference books or using the library.

There is a very readable 10 page paper, "Submit your first kernel patch", that I encourage you to read. I don't expect you to understand any of the code or the incantations recited to do "magic".
But you will understand:
  • the process is simple and well defined, and
  • the central tool, the version control system, is very, very simple.
So, you're a working Healthcare Professional and you want to write a paper, conduct some research or collaborate with your peers. How is any of this relevant to you? That's software, not research or document writing, isn't it?

What I didn't say is that all documentation and diagrams are stored in the Code Repositories as well. They're probably larger in size. The same rules apply: simple text files, common tools, known process.

You have Microsoft Word, an email account and a PC. How hard can it be?
MS-Word has some very nifty version control and you can review and merge/reject updates from multiple authors. So everything you need is sitting right there in front of you, isn't it?

No.

I recently went through exactly this "old-school" manual process. I started with an Apple word processing program, converting to '.doc' and then to OpenOffice format because we could all read that. Formatting was a mess, even though I'd used barely more "markup" than I use on a webpage. It took me many hours to get it readable, not close to good.

Although I'd said "the document has ONE owner, responsible for updating it", that rule was soon broken and old versions were updated and sent around.

The rule I'd proposed, "give every copy of the document a unique name" (your name + date/time), failed as well.
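For what it's worth, that naming rule is trivial to automate rather than trust to habit; a minimal sketch, with hypothetical names:

```python
from datetime import datetime

def versioned_name(base: str, author: str, when: datetime) -> str:
    """Apply the 'unique name' rule: base name + author + date/time,
    generated rather than typed, so copies can't collide."""
    stamp = when.strftime("%Y%m%d-%H%M")
    return f"{base}-{author}-{stamp}.doc"
```

The point isn't the code, it's that a rule people must remember will fail, while a rule a machine applies won't.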

Although there were only three of us collaborating on a 20-page document, we lost a substantial number of edits and wasted a large chunk of time doing it. Our submission was not nearly as good as it could've been, and it took rather more time and effort than it deserved.

For the next iteration, we used Google Docs. It isn't designed for full-on "Project Collaboration" like the defunct Google Wave, but it is very useful to us.

GDocs has two features that enable real-time multi-author Collaboration:
  • version history, allowing you to undo changes back to a point-in-time. You can't merge/reject all changes, but it's way better than nothing. Importantly, you know "What got changed?"
  • In a normal browser, simultaneous access and editing. You see the names of all other people with the document open for viewing or editing. They're given a unique colour, and what they type, or even just 'select', gets highlighted.
Our next document sped out the door because we used Google Docs.
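The "What got changed?" question that version history answers is the same one a plain-text diff answers, which is why simple text files matter so much. A minimal illustration using Python's standard difflib:

```python
import difflib

# two versions of a two-line document
old = ["The patient takes 5mg daily.", "Review in one month."]
new = ["The patient takes 2.5mg daily.", "Review in one month."]

# unified_diff marks removed lines with '-', added lines with '+',
# and unchanged context lines with a leading space
diff = list(difflib.unified_diff(old, new, lineterm=""))
for line in diff:
    print(line)
```

One glance at the '-' and '+' lines tells a reviewer exactly what changed between versions, with no special tools beyond plain text.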

You can take an existing document and upload it to GDocs, then have it convert it to one of its own files that you can collaboratively edit. At any time, you can download, right there from your browser, a copy of that document in a variety of useful formats: PDF, ".doc", OpenOffice and as a plain text file.

This is everything you need for document collaboration, even with large, distributed groups.

The lesson from 20 years of work on the Linux kernel is that you don't need expensive, complex and cumbersome tools to succeed in difficult tasks. Just the right number of simple and reliable tools.

Google will sell your business a complete on-line replacement for all your Document Processing needs. Everything lives "in the Cloud" and you can access them from anywhere with a browser and password.
Pretty neat and appealing, eh?!

While nothing beats the "Documents in the Cloud" model for some uses, to me it seems exactly wrong for handling all internal documents. Security is the least of it: an ex-employee or a hacker can get into all your files...

But mainly it's the distinction between ownership and control. While you still own all your content, you no longer control access to it.

If for some reason Google goes down (it's happened) or you lose Internet connectivity (there's a thousand "moving parts" between you and your data, literally), then your business is fried for the duration...

Please understand what I'm saying: For some uses by all businesses, "Documents in the Cloud" is a perfect solution; for all uses by some businesses, the dependency on the network and service provider will result in severe, even catastrophic, business impact.

I know there are competitors to Google Docs; I have had no need to go looking for them and use Google Docs as an example only. It worked for me, but Your Mileage May Vary.

One very worthy product/company that's worth looking at for personal use and collaboration, though I haven't tried it, is Evernote.

What's special about them is they allow you to capture text, sketches, even recordings (from anything, e.g. Pen recorders), from any platform: PC, laptop, iPad, Android tablets and smartphones, then they allow you to tag your data and search it.

Modulo the warnings about not having a local copy of all your data, Evernote seems to be setting up to fill the valuable niche the Filofax then PDA/Palm Pilot/Blackberry once filled: All my notes together.

That is important for Professional practice and for Collaboration.

Wednesday, February 13, 2013

When Less is More in Healthcare Spending: The "Region of Reverse Command"

A letter [30-Apr-2012] to an administrator inside "Healthcare Improvement".
It is related to a previous post: The Unnoticed Crisis in Healthcare.

I was hoping you could tell me if there have been any discussions amongst Healthcare Professionals about an effect known in Aviation as "Region of Reverse Command" or "being behind the power curve".

Hospital Blame Game: Fixable or Just Not Possible?

A reaction to an ABC report on 10% budget cuts in Victorian hospitals.

"Federal and State governments in hospital cuts blame game"
http://www.abc.net.au/7.30/content/2013/s3688997.htm

There are three systems effects that have so far been ignored by Hospital Administrators and Politicians:
  • "stitch in time" funding to avoid increasing total costs by over-waiting
  • reduce budgets by dropping the least vital work: "cut floors, not corners"
  • "Don't throw good money after bad", reduce spending where patient lifestyle affects outcomes and they won't change.

Saturday, January 26, 2013

Computer Security for Business Continuity in Healthcare-related Businesses

If you run a Healthcare-related Business, things changed in the last 6 months...
Ransomware is set to boom [0] and cyber-security is now part of our National Security Plan.
Update: Gartner has a report on Cyber-Insurance; Mandiant will give you a complimentary copy.

Businesses now have to secure their computers and data just as they secure their premises and goods.

Ask yourself this: "If my computers were destroyed, how long could I continue the business? At reduced capacity or at all?", then act accordingly.
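Acting accordingly starts with backups you have actually verified. A minimal sketch of one check: compare content digests, not file names or dates (the function name here is hypothetical):

```python
import hashlib

def same_content(path_a: str, path_b: str) -> bool:
    """Verify a backup matches its original by comparing SHA-256
    digests of the file contents, read in chunks."""
    def digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()
    return digest(path_a) == digest(path_b)
```

A backup that has never been restored and checked is a hope, not a plan; a content check like this, run routinely, is the difference.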

The Internet is defined by its explosive growth: A few For-Profit hackers have noticed Business Ransomware is an ideal way to monetise remote computer attacks & exploits.
Expect these attacks to double every few months now. In a year they will be endemic.

Every business that can raise $5,000 and relies on its systems and data for daily operations is now in their sights.