Author Archives: admin

More cuts!

The Ontario government recently announced another 1.3% cut to physician OHIP billings applied across the board, following cuts that happened in January of this year. I'm happy to see this has been hotly discussed in the news and online. Dr. Ming Yu's article in the Huffington Post, for example, has gone viral (according to the OMA). Most of the current debate, as far as I can tell, centers on the plight of doctors – are we being undervalued and disrespected, or are we too wealthy and entitled?

There doesn't seem to be much argument about the discrepancy between projected growth in health expenses and the increases in health care funding in the budget. Whoever is making the estimate, the consistent point seems to be that anticipated growth in healthcare expenses will exceed increases in government funding of health care. We can spend a lot of time talking about how much doctors get paid, but it made me wonder: if the government is willing to make this potentially very unpopular move (if comments on Kathleen Wynne's Facebook page are a reliable indicator), what does that say about their other options – and about the current state of our health system and economy?

I couldn't sleep the other night, and here is what I found out: the Fraser Institute published "Ontario's Debt Balloon: Source and Sustainability" in February 2015. To sum it up, it says Ontario's debt balloon – which reportedly expanded from 28% of the provincial economy in 2008/09 to an expected 40% in 2014/15 – is mostly due to high operating expenses, and that without significant changes the level of debt is unsustainable; in other words, at the current rate the province will eventually default on its debt. Ontario's credit rating was downgraded in 2012. The Council of Canadians published "A Difficult Road Ahead: Canada's Economic and Fiscal Prospects" in 2014. It concluded: "If spending on health care increases at rates close to the pace recorded over the past decade, Canada's provinces and territories will have to raise taxes to avoid deficits growing even larger." And further: "If Canadians don't want to pay higher taxes to cover surging expenditures on health care, the only other option will be to cut spending on social programs and education." Blame our current woes on mismanagement by our government if you like, but it seems to be a nationwide issue.

This is a selected representation of sources; maybe other authors have reached different conclusions about our state of affairs. However, working from very basic assumptions it seems possible to reach similar conclusions. In a big-picture sense – and I think these observations would be accepted by a casual observer – the population is aging and health care costs are therefore rising. For the same reason, revenue (and future revenue) is shrinking. Furthermore, economic growth on a worldwide level cannot continue indefinitely, as it involves extracting finite resources from the planet. The province's massive debt only compounds the problem. Therefore, we are going to have major problems sustaining our current way of delivering health care regardless of whose fault it is (and maybe it isn't anybody's fault), and maybe what is happening now to physicians is a sign that we finally can't keep throwing more money at the problem, because the credit line is maxed out.

To the doctors posting in online forums that they feel undervalued and are thinking about moving elsewhere, so the government should pay us more – I think we need to grieve, and then we also need to move on, for our own good. To put it another way, if you see steam coming out of your faucets, you should wonder whether your house is burning down instead of worrying about what is wrong with the tap. We could diversify our income streams, cut expenses, scale back our lifestyles, even look at overseas options – however one wants to cope in the short term. Then, we can move on to deciding whether we should be advocating for cuts to other social services, convincing the public they need to pay more taxes, abandoning the idea of universal healthcare, coming up with our own proposal for rationing health care, or pursuing some other plan of action to dramatically increase efficiency. Unilateral cuts aside, we're all citizens and we're going to need health care someday.

Even if I don't agree with the Health Minister's actions, I can certainly thank him for getting me (and other physicians all across the province, and the public) more engaged in thinking about the state of health care in Ontario. If not for the repeated pay cuts I probably wouldn't have clued in to the fact that big problems have been brewing for quite some time and that we really need to do something about them.

To be fair, I think doctors and the public are thinking about the bigger picture. The public discourse seems to be focusing mostly on the (relatively) smaller issue of physician compensation, but I imagine people are thinking about the larger context. We are likely to face some difficult questions as a society – are we going to spend proportionally more on health care, or are we going to accept that health care needs to look different than it does now, and different in what ways? How much can we gain by making health care delivery more efficient? If we can't make up the difference by increasing efficiency, what do we cut? How little can we get away with paying our professionals before we really start hurting from it? Which ones can we really do without? If we have to start cutting services, how will we decide which ones? Can we go further down the road of two-tier health care, shifting more services into the private sector for those who can pay for them?

I haven’t heard anybody saying that universal public health care is unsustainable and dying a slow, inevitable death, and I hope it’s because that’s not true. The conspiracy theory loving part of me can’t help but wonder if all the hand-waving about cuts to physicians is a distraction from the larger issue, because the political will to confront the difficult questions doesn’t exist and whoever brings it to the attention of the public is going to look bad. On the other hand, to echo Dr. Gail Beck’s sentiment – it is a good time to be a doctor in Ontario. I would add, maybe there hasn’t been a better time in a long while. There’s nothing like a crisis to put into perspective what really matters.

OSCAR and survivalists

It has been a bit over a year since I started using OSCAR and coming up on a year that I’ve been with my current OSCAR service provider. I’ve learned a few things along the way that I thought were worth sharing.

The first topic I thought I’d tackle is something I’ll call EMR survivalism, which I believe highlights some important aspects of the psychology of OSCAR. According to Wikipedia, survivalism is “a movement of individuals or groups (called survivalists or preppers) who are actively preparing for emergencies, including possible disruptions in social or political order, on scales from local to international.”


OSCAR users – looking to survive catastrophe?

I have noticed that this kind of survivalist mentality can be one factor that draws people to OSCAR. From my perspective (and please note I'm not trying to paint all OSCAR users with the same brush here), the prototypical OSCAR survivalist is an individualistic physician who believes that, because the physician bears ultimate responsibility for the care provided, the physician must also strive to maintain complete control over their clinical infrastructure and patient records. Such an individual may be suspicious of abuses of power by large corporations, inappropriate intrusions into clinical decision-making by administrators, government snooping and surveillance, and potentially preventable disasters or outages that disrupt clinical work. From the perspective of a survivalist, OSCAR seems very attractive – allowing one to maintain control over patient data and to prepare for the worst – power outages, networks going down, hardware failure, theft, EMR provider failure – at least in theory.

Does it actually work that way in practice? My best answer so far: sort of.

Does the average physician with some inclination to learn about computers know enough to effectively maintain server hardware, set up a secure network, and monitor for intrusions? I suspect not – at least not as much as a specialist would know. Ongoing maintenance is also a mental diversion that takes us away from doing what we're good at (and what brings in the money) – looking after patients.

Is it cost-effective for the average physician or small clinic to retain in-house IT support? I would say no.

Does the average physician have access to the infrastructure needed to maintain EMR functionality in the face of unanticipated disruptions – redundant and separate internet connections with sufficiently high upload speeds, independent power supplies, ready access to replacement components in case of hardware failure, backup server with automatic switchover in case of primary server failure? Possibly, but it’s not very efficient to set all of that up for one individual system. It would probably be much more cost-effective (and environmentally friendly) to pool resources and share server space in a dedicated, secure server facility – in other words, a cloud solution.

So, why go through all the trouble of setting up an in-house OSCAR solution at all? I believe it’s the same reason why people buy farmland in the country, stockpile supplies, and build bunkers. My hypothesis is that the kind of people who worry about a zombie outbreak, nuclear attack, and foreign invasion are similar, in ways, to a subset of the people who use OSCAR.

Fallout shelter

Are OSCAR users the kind of physicians who would build a place like this?

There are arguments for and against this way of living; in the event that nothing happens, it seems more efficient not to worry and trust in the people in charge. In the event of catastrophic failure in social order, though, the people with canned goods, medical supplies, and ammunition lining their basements look like they had a good idea. I suppose that whether or not it was “the right thing” to set up your own OSCAR server looks different depending on whether you’ve had years of reliable service from an EMR provider or you bought into a lemon and the company went belly-up or just didn’t pick up their phone when you needed them, leaving you stranded.

Is there a third way? I think there might be – and it comes down to trust and cooperation. In the event of a world-ending scenario I'm inclined to think that the people who prepare but also cooperate with others stand the best chance of doing well. Blind trust and obstinate self-sufficiency are two extremes, and let's face it – not everybody is going to have a country estate with a bomb shelter to fall back on. So, too, with OSCAR – by sharing resources, hardware, and infrastructure, individual physicians could have access to more reliable EMR service with lower overhead and possibly better security and maintenance expertise, without needing to put the keys in the hands of institutions or companies that may or may not come through. I'm talking about a co-op situation – physicians pooling their resources to run their own OSP (OSCAR Service Provider), hire support staff, and put together a server farm. At the end of the day we are responsible for the work we do and the people we hire, so I think we need to know at least something about the computer systems we use, as well. However, we don't have to do it alone.

As far as I know, the OSCAR co-op doesn’t exist yet. So for now, unless a doctor can do everything independently, the next best option is to find an OSP they trust, and for the extra-paranoid among us, keep a backup system handy.

OSCAR Crowdfunding

Today I found out through the OSCAR EMR mailing list that there are a number of projects open now for crowdfunding, including an upgrade to the prescription module and a billing module update. I have always found the billing module in OSCAR 12.1 a bit clunky – it works, but (as far as I know) it only supports billing through OHIP, and I would like to see some ability to track bills submitted to outside insurance companies or to the patient directly. If we all chip in a little bit, it will become a reality:

OSCAR EMR Crowdfunding projects

Practice update

My secretary, An, will be away until January 24. In the meantime, I will be handling my phone and fax machine (just like old times!). If you call the office it is likely that nobody will pick up, but please leave a message and I will endeavor to return your call promptly. I will be able to access my messages on weekdays even when I am not in the office.

An used to give appointment reminders by phone – I will not be able to do this while she is away. Automated email appointment reminders are high on the priority list but I have not gotten there yet – still working on getting electronic fax capabilities for consultation reports and switching to OSCAR-based billing, behind the scenes.

The office will also not be accepting new referrals for the first few months of 2015, to allow me to work through the waiting list, which is already several months long.

Thanks, and happy New Year to everyone.

EMR Hardware part 2 – network connections

As I discovered in my quest to set up a functioning EMR, an electronic record does not run on a single computer alone. To work, it needs a client computer to access the server, a network connecting the two, and some thought given to security and reliability. As the topic is broad, today I'll focus on what I learned about network connections, and discuss client computers, security, and reliability later. Perhaps a little background will make the networking details clearer.

For the total beginners, EMR software usually runs on a server computer; the user interacts with the software using a client computer that connects to the server, rather than operating the server computer directly. This is similar to how your computer is accessing a server to view this webpage. The nice thing about OSCAR is that it runs through a web browser; therefore, provided the client computer can connect to the server, any computer with a web browser can be used to access and operate OSCAR.

There are different ways the client computer can connect to OSCAR. One option is over a local network – the computers in the office connect to each other, but not to the wider Internet. The server is then located on-site and remote access is not possible. Another option is to connect through the Internet, which requires that the server be connected to the wider network, but it can then be accessed from anywhere with an Internet connection. One could connect to OSCAR locally while in the office and over the Internet from another location. Alternatively, OSCAR can be run on someone else's server (Application Service Provider, or ASP). This last option is not really DIY – the server is in someone else's hands, both physically and in terms of upkeep. Since I needed remote access but wanted some control over the setup, I opted to set up my OSCAR server at a central location so that I could access it from the various clinics where I work.

If one only plans to access the server locally (a very secure, but less convenient option), then a simple network switch connecting the computers should suffice. For remote access, Internet connections get involved and you will need a router.

When considering the Internet connection, the upload speed is very important. The server will be serving up files to the client computer, so – especially if multiple people will be working on the EMR at once – the server needs a fast upload connection. The problem with the average high-speed Internet connection is that it is biased towards fast download speeds so that users can watch movies, download music, etc. Uploading is much less of a priority, and understandably so – if everyone were running a file sharing service or a web server at home it could suck up bandwidth pretty quickly. If the upload speed is too slow, it bottlenecks the server. I've been told that an upload speed of at least 3-5 Mbps is required, and in my experience, that is sufficient.
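As a rough sanity check on why upload speed matters, the arithmetic can be sketched in a few lines. The page payload and load-time target below are illustrative assumptions, not OSCAR measurements:

```python
# Rough estimate of how many concurrent EMR users an upload link can serve.
# page_kb and target_load_seconds are assumed figures for illustration.

def max_concurrent_users(upload_mbps, page_kb=200, target_load_seconds=2):
    """Estimate users served at once, given an average page payload
    (assumed ~200 KB) and an acceptable page-load time (assumed 2 s)."""
    upload_kb_per_s = upload_mbps * 1000 / 8      # Mbps -> kilobytes/second
    kb_per_user_per_s = page_kb / target_load_seconds
    return int(upload_kb_per_s / kb_per_user_per_s)

# A 3 Mbps upload link, under these assumptions, serves a few users at once:
print(max_concurrent_users(3))  # -> 3
```

With these (made-up) numbers, a typical residential 1 Mbps upload would support only a single busy user, which matches the 3-5 Mbps recommendation above for a small clinic.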

The router is basically a small computer that directs internet traffic. You will need a router if you have more than one computer connected to the Internet. There is a huge range of routers on the market – ranging in price from $20-30 to several hundred dollars.

Since my home Internet provider uses a dynamic IP, the router needed to support Dynamic DNS (Domain Name System). The computer's IP address is like its postal address on the Internet; when one enters a URL into the browser, a DNS service uses the URL to look up the correct IP address and directs the client computer there. If the server has a static IP address, then its "location" on the Internet is always the same. If, however, it has a dynamic IP, the address might change from time to time – the DNS service's information can go out of date, and then the server would be impossible to find remotely. The solution is Dynamic DNS – the router checks the server's current IP address at regular intervals and feeds that information to the DDNS service, so even when the IP changes, the same URL will get you there.
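The update logic a DDNS client follows can be sketched as below. This is only a conceptual model – the `notify_service` callback is a stub standing in for whatever update API a real DDNS provider exposes:

```python
# Sketch of DDNS client logic: periodically compare the current public IP
# with the last one reported, and notify the service only when it changes.

def make_ddns_updater(notify_service):
    last_reported = None
    def check(current_ip):
        nonlocal last_reported
        if current_ip != last_reported:
            notify_service(current_ip)   # a real client calls the DDNS API here
            last_reported = current_ip
        return last_reported
    return check

updates = []                             # record what "the service" was told
check = make_ddns_updater(updates.append)
check("203.0.113.10")   # first run: reports the address
check("203.0.113.10")   # unchanged: no update sent
check("203.0.113.99")   # ISP reassigned the IP: reports again
print(updates)          # -> ['203.0.113.10', '203.0.113.99']
```

Routers (or tools like ddclient) run exactly this kind of loop on a timer, which is why a short check interval keeps the window of unreachability small after the ISP reassigns an address.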

There are many DDNS services available – for example, subscription services from Dyn. If you are really on a budget, free DDNS services also exist; you may need to modify your router firmware to use one (see below), as the commercial routers I've owned tend to support only a few of the larger subscription DDNS services.

There are commercial-grade routers that have more advanced wi-fi encryption, virtual private network support, and faster connections, but in my experience it isn't necessary to pay for most of these features. Given the slow upload speeds of most connections, a Gigabit router isn't necessary – a standard 10/100 Ethernet router, even though it tops out at 100 Mbps, will not be the limiting factor. Almost any old $20 second-hand router will do; most of the "advanced" features on the more expensive routers come from the software installed on them, rather than the hardware, and the software can be altered. This means that if your cheap router doesn't support DDNS, the firmware can often be replaced with DD-WRT, an open-source router firmware that does. When my D-Link router died after about 5-6 years of service I replaced it with a second-hand Linksys WRT54G (first released around 2002) and flashed the firmware so I could use DDNS. Instructions for doing this are widely available on the Internet; it takes a few hours, and some anxiety is involved due to the potential for "bricking" (rendering inoperable) the router if the procedure isn't followed properly.

Now that we’ve discussed the connecting hardware, we can look at the computer that will do the connecting – the client computer.

International Day for the Elimination of Violence Against Women

This post comes a day too late – this really should have been posted Nov 25, which is the official day.

A few years ago I was approached rather aggressively by a volunteer at a hospital, who was pushing a white ribbon on me. “Where’s your ribbon?” he demanded. It was on my other coat, I replied, which was actually true. He then insisted that I take another one, implying that I should be ashamed as a man not to show solidarity with others working to eliminate violence against women.

This incident left a lingering bad memory – I'm still of the opinion that participation in a cause under threat of shame or humiliation isn't really valid participation – but a few years later I've come to see the relevance of the white ribbon, and I think I also know why I was reluctant to put it on. Violence in general, and against women specifically, is scary and horrifying. It is easier not to think about it, maybe because it occurs too often, and the perpetrators are not so easily identified as someone "other" than us (not only soldiers, criminals, and gangsters, but ordinary people). Even one woman affected by violence is too many, though, because violence against women – and how we choose to respond to it – are choices, and therefore the violence can be prevented and changed. As physicians, we all know someone who has been affected, and the mental and emotional impact can be profound.

Too often we do not call it what it is; a rape is not called a rape – it is labelled something else, like a "sexual assault", which is not incorrect but is a much vaguer term. We may talk about criminal harassment as "unwanted attention", which might be accurate but doesn't convey the sense that it is also damaging, illegal, and morally wrong. Maybe as men we have pursued someone a bit too persistently, so we are reluctant to see ourselves as potential abusers, or we fear causing offense by using a word with shameful connotations – so we use euphemisms.

How can a person be treated for an illness if the illness is never named and identified? It is uncomfortable, certainly, to tell someone they have cancer. But we would never think to leave the diagnosis unnamed, and just give them chemotherapy anyway. I have noticed the temptation in myself, however, to treat depression and PTSD that are consequences of violence and gloss over the cause. Minimizing the violence sends a message, overtly or covertly, that what happened was somehow OK, and I don’t think that’s helpful for the individual or for our society. I could be doing better with this, and the white ribbon is a good reminder.

EMR Hardware part 1 – to rack, or not to rack?

Having picked OSCAR as the software for the EMR experiment, I set out to find suitable computer hardware to install a trial version of the software on. My plan was to set something up initially as proof-of-concept and give the software a test run before making a commitment.

As an overview, a few pieces of hardware are needed to make an EMR work. There needs to be a server computer that runs the software, a client computer that allows the user to access the software, and network connections between them. Other accessories, like battery backup, are not strictly essential, but are very important – these will be discussed in a later post. Today we'll take a look at the server hardware.

A perusal of the OSCAR user's manual (available online) suggested that for a single-person clinic, a typical $500 desktop PC would likely be sufficient. The OSCAR service providers seem to routinely provide Mac Mini computers (or something comparable) for about $1000. I did find an anecdote on the PEI OSCAR blog that it is possible to set up OSCAR on an old consumer-grade PC (see the link for an excellent breakdown of the potential costs involved in setting up OSCAR). The idea is to use a computer that is obsolete for most people's purposes, but still working. People do this all the time to run web servers for online games or to serve up webpages – why not use it for OSCAR, a browser-based EMR?

In my experience with OSCAR I have yet to see any source of information that compares performance of the software running on different server systems, so there is no way to know for sure (at least as far as I know) how little one can get away with in terms of buying hardware. I’ll provide my experience here in case it helps anyone else making a hardware decision.

I ultimately decided to use rack server hardware for my OSCAR server. I was able to find a first-generation IBM x3650 rack server on Kijiji for $250. It has an Intel Xeon 3.0 GHz dual-core processor, 4 GB of RAM, 8 x 73 GB hot-swappable hard drives, a hardware RAID controller, dual gigabit ethernet adapters, and dual hot-swappable power supplies. This was surely hot stuff in 2004, but by today's standards – when a pretty basic desktop computer comes with a 1 TB hard drive – it is pretty dated. It was a good computer to experiment on, and so far, in my experience, it has had enough power to do the job.

IBM x3650, Gen 1

The x3650, with face full of hot-swappable hard drives

From an efficiency perspective – and from what I managed to teach myself about computers – even an old rack server would have enough processing power to handle requests from one or two users, with room to upgrade to multiple users if needed; after all, this is what these machines were designed to do. There is minimal requirement for a graphics processor, so the server, which doesn't have a fancy video card, does fine here, and one is not paying for extra hardware that won't get used. Theoretically, server features like multiple hard drives arranged in RAID speed up data access; for an application like OSCAR I'm not sure it really matters. What I can say from my experience is that this machine was fine for me – one physician with a medical office assistant.

From a maintenance and reliability perspective, the server does have advantages over a converted PC. For one, there is a layer of redundancy. Parts that commonly fail (fans, hard drives, power supplies) come in pairs. Light path diagnostics let one tell at a glance if one of these components has failed and replace it – replacement is designed to be easy, to the point that many of the vulnerable parts are hot-swappable (they can be changed out while the computer is still running). By arranging multiple hard drives in RAID (Redundant Array of Independent Disks), the contents of one drive can be mirrored onto another; if one drive fails, the data are still safe on the other disk, and the array can be restored by replacing the failed drive.
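The mirroring idea can be illustrated with a toy model. Real RAID 1 operates on disk blocks inside the controller; this sketch just models the behaviour (every write goes to both drives, so losing either one loses no data):

```python
# Toy illustration of RAID 1 (mirroring). Not real disk I/O - a conceptual
# model only: dicts stand in for disks, keys stand in for blocks.

class Raid1:
    def __init__(self):
        self.drives = [{}, {}]          # two mirrored "disks"

    def write(self, block, data):
        for drive in self.drives:       # every write hits both drives
            drive[block] = data

    def fail_drive(self, index):
        self.drives[index] = None       # simulate a dead disk

    def read(self, block):
        for drive in self.drives:       # read from any surviving drive
            if drive is not None:
                return drive[block]
        raise IOError("array lost: both drives failed")

array = Raid1()
array.write("chart-001", "patient notes")
array.fail_drive(0)                     # one disk dies...
print(array.read("chart-001"))          # -> patient notes  (still readable)
```

Replacing the failed drive and copying the surviving contents back onto it is the "rebuild" step the hot-swappable bays make convenient.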

In contrast, the Mac Mini, which seems to be somewhat of a current standard in OSCAR computers for solo practitioners, is hardly user-serviceable at all. If something breaks, repairing it (if it can be repaired at all) is a complicated job involving delicate dismantling, and the more likely outcome is that it will need to be taken in to Apple. For mission-critical equipment, it seems important to be able to fix things quickly. Server hardware is also (at least theoretically) built to run 24/7 and designed with reliability in mind, which may not apply to many inexpensive desktop computers.

Is server-grade hardware overkill, though? For Internet retail sites where every minute of up-time translates into a dollar value in sales, hot-swappable components are probably valuable. Can an outpatient psychiatrist function for a day without a medical record? Maybe. My experience says that after a while, it starts to become mission-critical – how do you manage if you don’t even know who is scheduled to come for the day?

From an operating cost perspective, power consumption is a consideration – I have not done the measurements myself, but from my research, an old rack server like the x3650 runs hot, has a lot of fans, and uses considerably more electricity than a little, efficient Mac Mini. In my building electricity is included with rent, so the cost to me is no higher than for any other computer I may have chosen to use, but the environmental costs are probably higher than necessary. If a component fails in a rack server and needs to be replaced, it is cheaper than buying a new computer, as might need to be done with a Mac Mini. (Although at $250, one could just buy another rack server!)
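A back-of-envelope comparison makes the operating-cost difference concrete. The wattages and electricity rate here are assumed figures for illustration, not measurements of my setup:

```python
# Yearly electricity cost for a machine running 24/7. Assumed figures:
# ~300 W for an old rack server, ~20 W for a Mac Mini class machine,
# $0.13/kWh for the electricity rate.

def yearly_cost(watts, dollars_per_kwh=0.13):
    kwh_per_year = watts / 1000 * 24 * 365   # watts -> kWh over a year
    return kwh_per_year * dollars_per_kwh

print(round(yearly_cost(300), 2))  # rack server, assumed ~300 W -> 341.64
print(round(yearly_cost(20), 2))   # Mac Mini class, assumed ~20 W -> 22.78
```

Under these assumptions the difference is a few hundred dollars a year – small next to EMR service fees, but not nothing, and the environmental point stands regardless of who pays the hydro bill.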

There are two other important things I learned about rack-mounted servers that are easy to learn from experience but hard to glean from reading the Internet. The most important is noise – this is not a widely advertised property of rack server hardware, but it is an important consideration. I was told that a 2U server like the x3650 is much quieter than a 1U model because it is larger and can therefore accommodate larger (and therefore quieter) fans. However, it is still loud – compared to a desktop computer, the x3650 sounds like a jet turbine when it fires up and becomes only slightly quieter afterwards. I would not recommend putting one of these machines where one works – especially if one intends to talk to people. Clearly these machines are designed to be housed in separate, purpose-built rooms.

Server rack for the OSCAR server.

The x3650 racked up in an XRackPro2 noise-insulated server cabinet. Even with the insulation and glass, this thing is loud!

Another important observation is the size of a rack-mounted computer. It doesn't look imposing from the front, but the real bulk is in the depth. It is much deeper than one might suspect from looking at a picture – about 3 1/2 feet. Mounting it inside a rack cabinet further increases the required amount of floor space. This is not a setup that one can easily tuck away inside a closet or under a desk. While the server does not need to be mounted in a rack – it could be left on the floor or stood up on its side (using a special mounting kit) – there are some reasons for doing so. Mounting inside a lockable cabinet provides some security, preventing unauthorized physical access to the machine. If the machine will be anywhere near people, a sound-insulated cabinet can reduce some of the noise. It also just looks better. The problem with server racks is that they are either monstrous (full-sized 42U racks can be found retired from data centres, cheap on Kijiji, but they might be 7 feet tall and need two people to move) or expensive (under-desk models are in the $500+ range). What I didn't realize before I bought the computer is that many of the inexpensive 6U or 8U server racks available from the neighbourhood computer store are actually not deep enough to mount a rack server; they seem to be designed with other hardware in mind, such as audio equipment. I ended up buying an XRackPro2 – not cheap, but it provides some sound insulation and it is lockable. At the end of the day, though, it is still louder than I would like, and it is not easily movable.

In conclusion, my experience with the rack server was a bit like owning a vintage motorcycle. It's fun to set up and tinker with (and blog about), and the design appeals to a certain manly sensibility, but after the initial thrill wears off, one wants to trade it in for something quieter and more energy-efficient. If I were to do it again (set up a cost-effective EMR with minimal expense and no external funding), I might still use second-hand rack server hardware, but only if there were a dedicated, separate space to set it up in and it were likely to see heavy use where reliability is an important factor. Otherwise, the benefits in redundancy and ease of servicing are outweighed by the noise, size, and power consumption. It is also difficult and/or expensive to obtain a proper server rack and move it into place.

A Mac Mini is still attractive because of the size and power efficiency factor, but it loses points in my eyes for not being easily user-serviceable, and it costs twice as much.

I might actually opt for a quiet $500 desktop computer in a tower configuration that can be tucked away under a desk or inside a closet. The priorities would be a fast multi-core CPU and redundant hard drives if possible, while ignoring the video or multimedia cards. In terms of specifications, I'm not sure how little one can get away with for the CPU, but with respect to hard drive space I can say that notes in OSCAR take up very little room. After 5 months of use my notes take up about 5 MB. Scanned documents are another story, accumulating at a rate of perhaps 10-20 MB per month. At that rate it would take years to add up to even one GB, but it is also important to note that OSCAR stores daily backup files for a month, so multiply however much space you think you will use by a factor of 30. If there are extensive paper charts to be scanned, they will also require storage space. My rough estimate based on my experience would be about 10 MB per patient's paper chart, per year. Hard drive space is cheap these days, so 1 TB should be more than enough space, and affordable, if it's too hard to do the math.
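Putting those figures together, the math can be scripted. The rates below come from my own usage (notes ~1 MB/month, the midpoint of the 10-20 MB/month scan estimate, ~10 MB per chart-year of scanned paper records), so treat the result as order-of-magnitude only:

```python
# Order-of-magnitude storage estimate from the figures in the text:
# notes ~1 MB/month, scans ~15 MB/month (midpoint of 10-20), ~10 MB per
# chart-year of back-scanned paper records, and a factor of 30 for the
# month of retained daily backups.

def storage_mb(months, scanned_chart_years=0, backups_retained=30):
    notes = 1 * months                    # ~5 MB per 5 months of notes
    scans = 15 * months                   # midpoint of 10-20 MB/month
    back_charts = 10 * scanned_chart_years
    live = notes + scans + back_charts
    return live * backups_retained        # multiply by 30 for backups

# Five years of practice plus 200 chart-years of scanned paper records:
print(storage_mb(60, scanned_chart_years=200), "MB")  # -> 88800 MB
```

Even with generous inputs the total lands under 100 GB – well within a 1 TB drive, as stated above.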

Next up – how does one go from computer-in-hand to running server? The server needs to be configured, of course.

EMR Software – why open source is important

The first step in setting up an Electronic Medical Record is to pick the software. There are many options to choose from, but the list narrows significantly once you consider that OntarioMD deems only certain software providers to be funding-eligible. In other words, in order to get money from the government, one needs to choose from their list, and software on that list must meet a certain standard of functionality. At the time of this writing, there are at least 13 options on the OntarioMD funding-eligible list – still a lot to choose from.

Looking at software from the perspective of efficiency, it needs to have enough features to be functional, yet be simple enough that record-keeping does not take more time than it would on paper. From a private practice psychiatrist’s perspective, those functions include appointment scheduling, record-keeping, prescriptions, creating and faxing consultation reports, and billing. Other functions are a bonus. I have not tried every piece of software on the market (not even close), but I can say that when looking for software, it is important to make sure it does what is needed of it.

Functionality aside, let’s look at EMR software from the perspective of ease of maintenance and operating costs. These are more long-term issues which I will attempt to outline below. In my mind, the biggest factor to consider in both of these domains is vendor lock-in. Consider me paranoid, but using software that stores patient data in a proprietary format that would allow for a software provider to hold the data hostage does not sound like a good idea. Keep in mind that physicians need records to defend against complaints and lawsuits, not to mention keeping track of the care we are providing. Therefore, access to those records even decades into the future is extremely important.

Another factor to consider in the ongoing maintenance is longevity of the product. Software out of the box is great at the time. If it doesn’t change in ten years – not so great. Can you imagine using record-keeping software that still runs on Windows 95? What about Windows 3.1, or DOS? Even if software meets all of our needs at the present, standards of care and practice will change in the future. OHIP, for example, no longer accepts billings by diskette. In the future, all of our records may be connected by a network. It may become the standard of care to have clinical decision-making aids integrated into our software. We may need to make decisions based on individual patient parameters like genotype. If the software we use does not evolve, it will no longer be useful.

In order for an EMR to continue to evolve, it needs to be maintained. In order to be maintained, it either needs to be profitable (i.e. there is a market for it) so that a company will continue to work on it, or it needs to be backed by an enthusiastic user community. As I mentioned in my previous post, as long as the Ontario government is giving away money, there is an artificial market for EMR software. Doctors have money to throw away, so entrepreneurs are happy to develop software to collect it. After the money dries up, what happens? It seems to me that before signing on for any particular software, it would be a good idea to determine how many people use it, in how many places, and for how long.

With this in mind, I propose that an open-source option would meet the needs for a non-proprietary format and product longevity. Open source means that the software is usually free in the sense of costing little or nothing, and, more importantly, free in the sense that anyone can look at the source code and contribute improvements. Even if the original developer becomes defunct, the users of the software can band together to make sure it continues to be supported, and they do not have to depend on a development company to access their data.

In my search for open-source EMR software, only one option stood out – OSCAR, developed at McMaster University in 2001. It is, to my knowledge, the only OntarioMD funding-eligible option that is open-source. Since OSCAR is open-source, changes and improvements can be made by anyone. There is also no licensing fee to use OSCAR, and since it runs on a MySQL database, the patient records are not tied up in any proprietary format that a software company could use to hold one hostage. A fully-functional version of the software is freely available from the OSCAR EMR website – any interested party can download it, install it, and take it for a test drive. It therefore meets the needs for functionality, ease of maintenance, and low operating costs. It is also supported by a not-for-profit entity, OSCAR-EMR (similar, perhaps, to the way Canonical supports the development of Ubuntu). It seemed perfect for a self-maintained, do-it-yourself setup.
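To make the “no hostage-taking” point concrete: because the records sit in an ordinary MySQL database, a complete plain-SQL export is one standard `mysqldump` command away, readable by any MySQL-compatible tool. A minimal sketch of assembling that command follows – the database name `oscar` and user `oscar_user` are placeholders, not OSCAR defaults, so substitute whatever your installation actually uses:

```python
# Sketch: build a mysqldump command line for a full plain-SQL export of
# the EMR database. "oscar" and "oscar_user" are placeholder names.
import subprocess
from datetime import date

def build_dump_command(db="oscar", user="oscar_user", outfile=None):
    """Return the mysqldump argument list for a full plain-SQL export."""
    if outfile is None:
        outfile = f"{db}-backup-{date.today().isoformat()}.sql"
    return ["mysqldump", "--user", user, "--password",
            "--result-file", outfile, db]

cmd = build_dump_command()
# subprocess.run(cmd, check=True)  # uncomment to run the actual export
```

The `--password` flag (with no value) makes mysqldump prompt interactively, so the password never lands in the shell history. The resulting `.sql` file is human-readable text – the opposite of a locked-up proprietary format.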

Next, we’ll take a look at setting up hardware to run a basic OSCAR EMR system.

The Do-It-Yourself EMR Experiment

Five months ago I decided to try an experiment – see if it would be possible to set up an Electronic Medical Record (EMR) on a limited budget. OntarioMD provides government funding to physicians who are looking to switch from paper to electronic records but the amount of funding is limited. (I applied in August 2013 and I got approved almost a year later). My question, therefore, was whether it is possible for a physician to set up an EMR independently of government funding. If EMR is the way of the future, why should it need to be heavily incentivized in order to get people to make the switch? If it really is better, then shouldn’t it be faster than paper, easy to maintain, and competitive with paper charts in terms of operating costs?

There are other reasons why I thought this experiment would have social value. Besides physicians who are new to practice, there are other professionals (e.g. naturopathic doctors, independent psychotherapists and counselors) who could use an EMR but who do not have access to funding.

In this series on EMRs, I’ll write about my experience trying to set up an EMR system keeping those three points in mind:

1. Efficiency
2. Operating costs
3. Maintenance

Efficiency is relevant because if the EMR, subsidized or not, is slower to operate than keeping paper charts in a filing cabinet, it does not make sense for the average doctor to make the switch. Yes, there are future visions of big medicine – connecting all of the EMRs in a network that would allow for information sharing and data mining (and government snooping, perhaps?). In the long term that may contribute to better care from a systems / population perspective, but I would argue that most doctors care primarily about whether it helps us provide better care to the patient in front of us right now. If the record-keeping system slows us down or does not add any short-term benefit, it is not very attractive. Spend a little time on online self-help forums or talking to colleagues and it does not take long to hear stories about the physician who stays a couple of extra hours at the end of the day to finish typing paperwork, whose appointments run over time because of the extra time it takes to figure out how the prescription module is supposed to work, or who (worst-case scenario) cannot function at all because the computer is down.

Operating cost is also very important – a doctor could apply for the government funding and wait until it goes through before switching over, but the funding is only for a number of years and after that, the burden of maintenance goes back to the physician. In my mind, that means the EMR had better be easy and cost-effective to maintain after funding runs out, or else we would be foolish to jump on the bandwagon only to be saddled with the burden of maintaining aging computer hardware and continuing to pay service fees that mostly benefit the proprietary software companies and service providers that sprung up when the subsidy gravy was flowing.

Ease of maintenance is closely related to operating cost, but not exactly the same thing. Even if a doctor contracts out the maintenance of the EMR to a third party, the doctor is ultimately responsible for whether or not it works, because we are the ones who bear the consequences if it doesn’t. There are also factors a third-party service provider is not directly responsible for – the physician’s client computers (the ones used to access the server), the Internet connection, and the time it takes the physician to learn the software and ensure that staff know how to use it. Also, in a solo or small group practice without in-house IT staff, the physician is more than likely going to be the one troubleshooting small problems, and therefore (in my opinion) should know how the system works. Consider, as an analogy, commuting to work on a bicycle. Of course, you can take your bike to a shop every time you want it tuned up. Even so, if it breaks down on the road, the rider’s tools and mechanical knowledge make the difference between getting back on the road and calling a taxi or walking to work.

Next up, we’ll look at the first step in constructing the EMR – selecting the software – keeping the three above points in mind.

DIY Medicine

A big challenge that I’ve noticed in my work is how to get help for the people who need it – even when they take the step of asking for help, wait times are very long. However – it seems that a lot of psychiatry is not rocket science (as an example, see this article in The Guardian on the MANAS intervention, where lay people were trained to deliver effective psychotherapy).

This study really makes one think about how mental health care could be provided for all people, by all people. I believe, for example, it is quite reasonable to argue that mindfulness training and interpersonal skills should be in the hands of the average person. In a sense, they still are – there are many places where one can learn to meditate for free. At the same time, there seems to be a trend in our society toward ultra-specialization, so that there is an expert for everything and less emphasis on disseminating skills for people to help themselves.

All of this brings me to the issue of how medical knowledge is disseminated. In the computer world we have open-source software – freely available, free to modify, free to disseminate. This is, in some ways, the opposite of proprietary software, which makes the user reliant on an expert company or institution for updates and licenses. Is mental health care a bit too much like Microsoft or Apple?

Psychiatrists could probably be doing a lot more to work with other people – artists, designers, or software developers – to package health information in a way that is more easily disseminated and accessed. We could put more power in the hands of people to help themselves. OHIP doesn’t compensate us to do that (though we are nicely compensated for working in a hospital or an ER). Still – would wellness promotion and education be a more economical use of our time?