
Conference Honoree and Keynote: Roger Hoerl


Roger W. Hoerl is the Brate-Peschel Professor of Statistics at Union College in Schenectady, NY. Previously, he led the Applied Statistics Lab at GE Global Research. Hoerl has been named a Fellow of the American Statistical Association and the American Society for Quality, and has been elected to the International Statistical Institute and the International Academy for Quality. He has received the Brumbaugh, Bisgaard, and Hunter Awards, as well as the Shewhart Medal, from the American Society for Quality, and the Founders Award and Deming Lectureship Award from the American Statistical Association. His introductory text Statistical Thinking: Improving Business Performance, co-authored with Ronald Snee and now in its third edition, was described by the journal Technometrics as “…probably the most practical basic statistics textbook that has ever been written within a business context.”

The Future of Statistics in an AI Era

The statistics discipline has been in a state of flux since the arrival of “data science” as a recognized discipline fifteen or so years ago. There has been competition with, and perhaps to some degree capitulation to, the newer discipline, which arose primarily out of computer science. More recently, the arrival of generative AI, in the form of large language models and chatbots, has provided another shock to the system, raising additional questions for the statistics profession. Where does statistics as a discipline stand now? How does it fit into the emerging ecosystem for analytics? What are the key questions the discipline needs to answer to secure the brightest possible future? I will attempt to provide some answers to these difficult questions, referring back to previous times of flux, such as the growth and dominance of mathematical statistics beginning fifty or so years ago. My primary message is that the statistics discipline may be selling itself short, not fully appreciating the perspective and skills that it, and no one else, can provide.

Plenary Speaker: Kamran Paynabar


Kamran Paynabar is the Fouts Family Chair and Professor in the H. Milton Stewart School of Industrial and Systems Engineering at Georgia Tech. His research focuses on the methodological and applied aspects of machine learning and statistical modeling integrated with engineering applications. His work has been supported by agencies such as NSF, NIH, DOE, the U.S. Air Force, and industry leaders including Samsung, Ford, and Boeing. Professor Paynabar's research has earned best paper awards from INFORMS, IISE, ASA, and POMS. He has also received multiple teaching honors from Georgia Tech, IISE, and the NAE. He currently serves as the Editor-Elect of Technometrics and has previously served as a Department Editor for IISE Transactions and as an editorial board member for the Journal of Quality Technology. He has chaired the Quality and Reliability divisions of both IISE and INFORMS and is recognized as a Fellow of both ASQ and IISE, as well as an elected member of ISI. Additionally, he is a co-founder of ProcessMiner, a startup specializing in AI and machine learning solutions for quality improvement in continuous manufacturing.

Low-Dimensional Learning from High-Dimensional Data for Quality Modeling and Improvement


Industry 4.0 has enabled large-scale, agile data collection through cyber-physical systems and the Internet of Things, creating unprecedented opportunities for quality modeling and performance improvement. However, the complex structure of this data introduces significant analytical challenges, characterized by high variety, high dimensionality, high velocity, and intricate spatial-temporal structures. In this talk, I will present our efforts to develop efficient methods for learning low-dimensional structures from complex high-dimensional data. These methods are designed to address core challenges in system modeling, monitoring, control, prognostics, and optimization. I will also highlight their validation across a range of application domains, including additive manufacturing, the automotive sector, solar energy, silicon wafer fabrication, and microstructural materials analysis.

Plenary Speaker: Jennifer Van Mullekom


After a 20-year career in industry, Dr. Jennifer Van Mullekom joined Virginia Tech in Fall 2016 as the Director of the Statistical Applications and Innovations Group (SAIG), where she is a Professor of Practice in the Department of Statistics. In addition to directing SAIG, she teaches collaboration skills and design of experiments to graduate students while serving as an active member of the global statistical practice community. Formerly, she was a Senior Consulting Statistician and Certified Six Sigma Master Black Belt in DuPont's Applied Statistics Group, supporting the DuPont Protection Technologies business. At DuPont, she provided statistical leadership to the Tyvek® Medical Packaging Transition Project in the areas of product development, quality, commercialization, and regulatory affairs. Her contributions to this project earned her a DuPont Engineering Excellence Award, one of the company’s highest honors. She continues to collaborate with DuPont on various projects to maintain and grow her skills as a professor of practice. Jen is active in professional societies, holding leadership roles in the American Statistical Association (ASA) and the American Society for Quality (ASQ). She is an inventor on two US patents and has also worked at Lubrizol and Capital One. Dr. Van Mullekom presents regularly at the Conference on Statistical Practice on topics such as communication, collaboration, leadership, and ethics. In 2024, she was honored with the American Statistical Association’s Section on Statistical Consulting Mentoring Award for her role in mentoring junior employees, colleagues, and students. She holds an MS and PhD in Statistics from Virginia Tech, and a BS in Mathematics and a BS Ed in Mathematics Education from Concord University.

Measuring Quality in the Age of Artificial Intelligence and Machine Learning: Emerging Challenges and Classical Foundations


Concepts around quality management have evolved over the years. This evolution began with the democratization of statistical analysis through software in the 1990s, followed by the integration of data repositories and real-time monitoring tools in the 2000s and beyond. The quality community is now entering a new era with the proliferation of Machine Learning (ML) and Artificial Intelligence (AI). While the foundational principles of quality management methods such as Lean and Six Sigma remain relevant, the paradigm has shifted. Traditional quality engineering must adapt to data-driven, probabilistic AI systems, where the concept of a defect may be ambiguous or evolve over time. Quality engineers need implementable, modernized frameworks, tools, and methods to ensure the reliability, consistency, and safety of AI systems. As a community of applied researchers and practitioners, it is our responsibility to collaborate and rise to meet these challenges. Today’s research becomes tomorrow’s commonly used methods, implemented in quality management software and systems. This talk will focus on emerging challenges and classical foundations in three key areas: 1) post-deployment ML model monitoring, 2) generative AI systems, and 3) agentic AI systems. In addition, the ethics of AI/ML quality will be addressed in relation to the American Statistical Association’s (ASA) Ethical Guidelines for Statistical Practice and the ASA’s Statement on Ethical Artificial Intelligence. While you may leave the talk with more questions than answers, my hope is to inspire you to engage and collaborate to create solutions to these challenges.
