Electronic data capture (EDC) systems allow sponsors to gather trial data electronically. In theory, that should be cheaper and faster. After a decade of gradual acceptance of the technology, all sponsors have used EDC systems and some rely on them heavily. Contract research organizations have followed suit. But what about clinical sites?

Individual EDC vendors often claim that sites cherish their technology. But many sites say the advent of EDC has added to the burden of work and strained their relationships with sponsors. Some of the issues, in the end, are artifacts of rendering regulatory requirements into software—or of feeding a fetish for paper that exists in both American medicine and the sponsor community. But that's not to minimize the genuine frustration that sites feel with the technological underpinnings of most modern trials.

Christine Pierre, CEO of the 16-year-old site network RXTrials and founder of Site Solutions Summit, says the technology isn't ready for prime time. “We definitely as an industry need to embrace technology far more than we are; it's just that the development of this particular technology has excluded the site's perspective,” she says. “The sites get upset, call sponsors, and the circle of frustration begins.” There are multiple issues:

The technology requires duplicate or triplicate work flows. Patricia Larrabee, founder and CEO of Rochester Clinical Research, which does about 50 studies a year, says she remembers the first time a sponsor asked her to log trial data onto a laptop. “The original idea was to input information once, to enter data and be done,” she says. “And that was great, but that fell apart early on.” One complication is the advent of electronic health record (EHR) systems for physicians and hospitals, which have entirely different features from those needed in industry-sponsored research.

The current reality at many sites is that paper and multiple technology systems are the rule. As a patient is seen, a site staff member first fills out a paper source document. Then he or she may have to enter the same information into an EHR system. Finally, the data may be rekeyed a third time—into an EDC system. Very few EDC systems are integrated with anything else at the site. So sites justifiably feel their work is doubling or tripling.

Then there are even more improbable inefficiencies. Michael Koren, CEO of the Jacksonville Center for Clinical Research, which does about 150 studies a year, says he wonders why sponsors can't sync the EDC system on their end with what the site sees.

For example, he says, “a site like ours gets a protocol and it has the number of visits and procedures and things that need to be done, and if you're using an EDC system, someone on the site side has to manually enter all that in there. Someone's already doing it on the sponsor side. It's just silly that the two don't communicate. That would be a huge time saver.”

EDC systems tend to be slow, hard to navigate and prone to crash. Larrabee says a paper case report form (CRF) took 10 minutes to fill out by hand. “EDC takes two to three times that long,” she says. “Each page has to be sent and received before you can get to the next one, and it can take 30 seconds to send one page. It can take five minutes just to log in, and we're using 10 different systems at any one time.”

So in the era of EDC, investigators may find themselves parked in front of the computer for hours—and often on the weekends to avoid using time that should be spent with patients.

That's especially common when the database is about to be locked. In the final 24 hours before data lock, investigators have to sit down with the EDC system and, in many EDC solutions, sign off on each page of each patient visit. “Sometimes we have to sit there for three or four hours because the system is so slow,” says Mervyn Weerasinghe, medical director of Rochester Clinical Research. Of course, that's a requirement that can be traced back to federal regulations, not the whims of sponsors or EDC suppliers.

Says Larrabee: “EDC is really for data managers, not really for the people actually inputting the data.”

Frequent server crashes. Sites report these tend to happen right at database lock or enrollment deadlines, when lots of sites are using the systems at once. A few sponsors test their servers to ward off such crises, says Pierre. Most don't. And when things crash and burn, locking out the sites, data entry ceases. Performance metrics that come out of such an event make a site look unproductive, and there's no asterisk to explain that the stumble was due to server problems—not incompetence.

A common subset of that IT issue occurs during randomization, she says. Suppose enrollment is starting for a high-volume trial like a vaccine study, for which large sites might be enrolling up to 90 patients a day. The day dawns over the East Coast and sites there begin seeing potential subjects. But when it's time to randomize them, and several sites try to connect to the interactive voice response system (IVRS) at the same moment, the system crashes for several hours.

Potential study subjects are asked to wait, but eventually they are sent away and asked to return the next day. The server problem is subsequently fixed. Then the recruiting day begins for the West Coast sites, which deliver all the subjects the sponsor needs. Enrollment closes. The next day, all the subjects who return to the East Coast clinics are told: never mind, the trial is full. This sounds extreme, but Pierre says it's all too common. If one wanted to draw up a blueprint to antagonize patients, this would be a good start.

by Suz Redfearn 

Editor's note: the second half of this story is here.
