Adam Coscia is a Human-Centered Computing Ph.D. student at Georgia Tech, working closely with Alex Endert as a member of the Visual Analytics Lab. His work focuses on designing and developing visual analytics tools that help people comprehend complex data, mitigate biases, and make better decisions. Currently, his primary focus is helping NLP researchers and engineers analyze and interpret insights from large language models (LLMs) across their training, inference, and comparative performance stages. Beyond academia, Coscia has contributed to real-world applications, partnering with scientists engaged in field and mission operations. Together, they’ve crafted visual analytics solutions to support deep-ocean expeditions and enhance scientist-guided spacecraft autonomy. Coscia’s journey has involved collaboration with professionals from various institutions, including Vanderbilt, Georgia State, Tufts, Emory, Caltech, ArtCenter, the Monterey Bay Aquarium Research Institute (MBARI), and NASA’s Jet Propulsion Laboratory (JPL).
What are your primary research areas with AI-ALOE?
My research goal with AI-ALOE is to enable end-to-end evaluation of large language models (LLMs) deployed in novel educational technologies. The learning community has begun integrating LLMs like ChatGPT into adaptive learning tools for improving adult education. The benefits of LLMs are numerous: they enable new methods for generating educational content, power new tools for automatically evaluating student work, provide new interfaces for assisting students in the learning process, and more. However, multiple stakeholders, from linguistics researchers to learning engineers and even teachers, are concerned with safely deploying LLM-powered technology in critical learning environments. Through a human-centered design process, I have begun designing, developing, and deploying actionable, interpretable data visualizations that give stakeholders an interactive, scalable interface for understanding and validating LLM performance, ultimately making the decisions our AI produces more transparent and responsible.
What motivates and guides your research at AI-ALOE?
My research addresses ethical concerns with deploying LLMs in educational tools, ensuring fairness and transparency. By developing visualization tools, I aim to enable stakeholders to understand and interpret LLMs’ performance effectively.
How did you become interested in your research field?
My interest in explaining complex data led me to pursue a PhD in Human-Centered Computing at Georgia Tech. I design visual analytics tools to help people understand and explain AI systems deployed in critical environments like classrooms.
Do you have a favorite hobby outside of research?
Lots! I love sports. I play baseball, softball, golf, and tennis. I enjoy bouldering, hiking, swimming, and snowboarding in the winter. I go for lots of runs, and I recently started bodybuilding! I play lots of video games. I want to try my hand at game design and development someday. I also enjoy reading fantasy novels like “The Lord of the Rings.” I cook every day, trying new recipes and techniques all the time! From time to time, I enjoy trying different art projects such as making pottery, painting, and drawing. And of course, I love doing all of these activities with my friends and family.
Could you share an interesting and enjoyable tidbit about yourself?
Fun facts: I taught snowboarding professionally for five years in Eastern Pennsylvania, and I’ve interned at NASA’s Jet Propulsion Laboratory (JPL) twice!
RECENT PUBLICATION
Adam Coscia, Langdon Holmes, Wesley Morris, Joon Suh Choi, Scott Crossley, and Alex Endert. 2024. iScore: Visual Analytics for Interpreting How Language Models Automatically Score Summaries. In 29th International Conference on Intelligent User Interfaces (IUI ’24), March 18-21, 2024, Greenville, SC, USA. ACM, New York, NY, USA, 16 pages. https://doi.org/10.1145/3640543.3645142