IBM Research will contribute its expertise in medical-dictation processing to Nuance Communications with the goal of using speech recognition to incorporate structured data into electronic health records.
IBM and Nuance Communications have announced a partnership in speech recognition to
direct physician-dictated text into the structured fields of an EHR (electronic
health record). Nuance is the maker of the Dragon
speech-recognition software, which in addition to health care and other
industries is used in the White House and Defense Department.
According to Peter Durlach, Nuance Communications' senior vice president of marketing and
product strategy for health care, the collaboration between IBM
and Nuance will use IBM's research in NLP
(natural language processing) to enhance Nuance's CLU
(Clinical Language Understanding) software products.
CLU technology is a health-care-specific
type of NLP that involves extracting specific data about a patient's condition
from the narrative text dictated by a physician or nurse. CLU
is a core element of EHR workflows, according to Nuance.
The partnership "will combine some of the work IBM has done
in the natural language processing area with the work we're already doing at
Nuance to tackle that big problem in health care, which is, How do you get
structured data out of the narrative part of the dictation?" Durlach
explained to eWEEK.
The "new partnership will enable IBM Research to
develop improved technology where the information extraction system can benefit
from structured knowledge such as a medical ontology of symptoms to develop an
understanding of the dictated text," Salim Roukos, IBM's
senior manager of NLP Technologies, wrote in an e-mail to eWEEK.
According to Roukos, IBM has been developing new ways
to extract discrete information from dictated material using advanced text
analytics. The narrative part of a doctor's dictation involves an unformatted, or
unstructured, section of text that an EHR is unable to process into separate
fields. "If you don't get that structured data, all that narrative is just
there as a blob of text in the database," Durlach said. "It's very
hard to do decision making after the fact because you don't have structured
field-level data."
Durlach said the collaboration with IBM will allow the
CLU application to "parse [data] out of
that narrative blob of text and populate EHR fields so you can do data
mining." Physicians will not have to key data into the EHR fields.
Durlach compared the use of CLU to being able to
extract mentions of specific companies from a radio or television broadcast.
Nuance has posted a video
of a doctor using CLU
technology to dictate a patient's diagnosis and treatment plan into an
EHR. Of the approximately 2 billion medical reports dictated per year in the United
States, most are dictated by physicians
using this narrative process, Durlach noted.
Physicians can still use the narrative format and have the CLU
technology format it, he explained.
Current EHRs "require physicians to point and click through multiple screens. The
physicians can't stand it because it slows them down; they're very
awkward," Durlach added.
"[Electronic medical records, or EHRs] have a significant amount of freely
dictated/written unstructured doctors' notes and comments," Roukos
explained. "The CLU analyzes this
unstructured text and extracts facts into structured tables such as any
allergies a patient might have. The structured data can be used to automate any
checks to improve on health care services."
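Nuance and IBM have not published how CLU works internally, but the general idea Roukos describes, pulling facts such as allergies out of free text into structured fields, can be sketched with a deliberately simplified rule-based extractor. The function name, patterns, and sample note below are hypothetical; a production system like CLU would rely on trained statistical models and medical ontologies rather than regular expressions:

```python
import re

# Toy sketch only: production clinical NLP relies on trained models and
# medical ontologies, not a handful of regular expressions.
def extract_structured_fields(narrative: str) -> dict:
    """Pull a few structured EHR-style fields out of free-text dictation."""
    fields = {"allergies": [], "medications": []}

    # Capture the list of allergens after an "allergic to ..." phrase.
    allergy = re.search(r"allergic to ([\w ,]+?)(?:[.;]|$)", narrative, re.I)
    if allergy:
        fields["allergies"] = [a.strip()
                               for a in re.split(r",| and ", allergy.group(1))
                               if a.strip()]

    # Capture drug names mentioned after "prescribed" or "taking".
    for med in re.finditer(r"(?:prescribed|taking) (\w+)", narrative, re.I):
        fields["medications"].append(med.group(1))

    return fields

note = ("Patient is allergic to penicillin and sulfa. "
        "Currently taking lisinopril; prescribed metformin today.")
print(extract_structured_fields(note))
# {'allergies': ['penicillin', 'sulfa'], 'medications': ['lisinopril', 'metformin']}
```

Once the facts sit in discrete fields rather than a blob of narrative text, the automated checks Roukos mentions, such as flagging a prescription that conflicts with a recorded allergy, become straightforward database queries.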
CLU can identify and pull data on medical problems, social history, allergies and
medications from the narrative text, according to Nuance.
CLU technology can also alert health care
providers to previous information about a patient, Roukos said.
Other companies, such as 3M,
also offer dictation apps for health care.