Over the past decade, the United States has invested heavily in the development and deployment of remote sensing systems and novel analytics for intelligence, surveillance, and reconnaissance activities throughout the national security community. But as algorithms, software, and hardware have evolved in leaps and bounds, end users increasingly find themselves swamped by systems that are simply unusable. It’s time to pause and ask ourselves hard questions about what we expect people to actually do with the systems we create: How can we make sure our analytics are adoptable, useful, and usable for the people whose work we intend to 'improve'? Based on a decade’s worth of human-information interaction research across the national security community, this talk examines the challenge of designing analytic systems to be usable, useful, and adoptable in the complex, highly fractionated world of the national security workplace. I’ll discuss a set of practices and principles that our researchers are using to inform the development of analytic systems that actually work for human users and operators, drawing on frameworks from human factors, design ethnography, and human-computer interaction. Finally, I’ll talk about the need for new human-information interaction evaluation frameworks that can help analytic system developers gain insight into the real-world usage of their tools, so that we can better understand the perceptual and cognitive requirements of their human users.

Host: Curt Canada