Businesses are constantly optimizing, particularly in the design and operation of complex systems. The combination of engineering and business in the LGO curriculum uniquely prepares students to work on systems optimization problems in a variety of industries. For example, LGO students are well qualified to tackle complex healthcare systems engineering problems that can affect how many patients a hospital can treat. These problems all rely on large-scale analytics models to produce results.
Sequential Optimization for Prospective Customer Segmentation and Content Targeting
Kenny Groszman (LGO ’22)
Problem: A key challenge for ResMed’s ongoing growth is increasing awareness and treatment of obstructive sleep apnea (OSA), a highly prevalent and undiagnosed sleep disorder. Kenny’s project focused on an algorithm to improve the efficacy of online paid advertising as a means to increase OSA awareness and help undiagnosed patients get on the path to treatment.
Approach: Building on available online advertising performance data and web analytics data, Kenny developed and validated a sequential optimization framework (B-SMAC) to target online ads across customer segments efficiently and automatically. The algorithm was able to identify the best-performing target audience while using minimal experimental resources. Because this was the first marketing use case for ResMed’s data science team, Kenny had to develop stakeholder relationships and new data privacy procedures.
Impact: Ad targeting by Kenny’s model outperforms human expert-based ad targeting by 2.8x, and outperforms Facebook’s internal black-box targeting by 1.6x. In addition to this performance gain, his model uniquely provides customer segment-level ad effectiveness insights that can be leveraged by the business. Kenny transitioned his model and technology to the ResMed team so they can continue this approach going forward.
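The sequential-targeting idea can be illustrated with a minimal Thompson-sampling bandit sketch. This is not ResMed’s actual B-SMAC algorithm, just a generic multi-armed bandit over hypothetical customer segments, assuming each segment has a fixed (unknown) conversion rate; the rates and budget below are invented for illustration:

```python
import random

def thompson_step(successes, failures, rng):
    """Pick the segment whose sampled conversion rate is highest (Thompson sampling)."""
    samples = [rng.betavariate(s + 1, f + 1) for s, f in zip(successes, failures)]
    return max(range(len(samples)), key=lambda i: samples[i])

def run_bandit(true_rates, budget, seed=0):
    """Allocate `budget` ad impressions across segments one at a time."""
    rng = random.Random(seed)
    k = len(true_rates)
    successes, failures = [0] * k, [0] * k
    for _ in range(budget):
        i = thompson_step(successes, failures, rng)
        if rng.random() < true_rates[i]:   # simulated ad conversion
            successes[i] += 1
        else:
            failures[i] += 1
    return successes, failures

# Segment 2 has the best (hypothetical) conversion rate; the bandit
# concentrates spend there while sampling the weaker segments only briefly.
succ, fail = run_bandit([0.02, 0.05, 0.12], budget=3000)
```

The appeal of this family of methods, as in Kenny’s project, is that the experiment identifies the best-performing audience while spending relatively little budget on the losing segments.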
Predictive Modeling and Optimization of Autoinjector Manufacturing
Levi DeLuke (LGO ’21)
Problem: How can upstream subcomponent and process data that is predictive of final product performance and customer experience be identified and used to improve the manufacture of autoinjectors—a key medical device relied upon by millions? Within this broader challenge, Levi’s goal was to identify upstream subcomponent variability that was predictive of final device performance and to use a predictive model and linear optimization to “pair” available subcomponent batches together to ultimately reduce final batch variability.
Approach: Levi’s project focused on three areas: structuring data across the product genealogy to visualize correlations across manufacturing stages, building a predictive modeling framework to predict final-lot characteristics from upstream subcomponent data, and optimally pairing subcomponent batches to reduce final-lot variability beyond what conventional statistical process control measures achieve.
Impact: This project supported ongoing predictive modeling efforts as well as process monitoring workstreams by highlighting which parameters are “important” for a given final testing characteristic of a given product. Levi’s work pointed to estimated 30-45% reductions in the variability of a key final testing parameter for a specific finished autoinjector product, based on simulations comparing the optimal pairing approach against historical variability.
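The batch-pairing idea can be sketched as a small assignment problem. The toy linear model below (final characteristic = spring force minus barrel friction) and all batch values are invented for illustration, not Levi’s actual predictive model; for a handful of batches, brute-force search over pairings is enough:

```python
from itertools import permutations

def best_pairing(spring_forces, barrel_frictions, target=10.0):
    """
    Pair each spring batch with a barrel batch so the predicted final-device
    characteristic (toy model: force - friction) stays closest to the target
    across all pairs, i.e. minimize total absolute deviation.
    """
    n = len(spring_forces)
    best, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        cost = sum(abs((spring_forces[i] - barrel_frictions[j]) - target)
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    return [(i, best[i]) for i in range(n)], best_cost

# A high-force spring batch is matched with a high-friction barrel batch,
# pulling every predicted pair toward the target value of 10.0.
pairing, cost = best_pairing([12.0, 11.0, 13.5], [2.5, 1.0, 3.0])
```

At production scale this becomes a linear assignment or integer program rather than brute force, but the objective, pairing batches so predicted final variability shrinks, is the same.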
Implementation of Mathematical Approach to Rip Saw Arbor Design and Scheduling
Harry Birnbaum (LGO ’21)
Problem: AHF Products’ plant in Beverly, WV manufactures solid wood flooring (SWF). More than half of the delivered costs of SWF are from lumber and AHF’s plants strive to maximize “material yield”: the percentage of the untreated “green lumber” purchased from suppliers that eventually ends up in the box of finished flooring. Each manufacturing step generates a certain amount of scrap that contributes to the overall plant yield loss and yield has a large impact on the P&L of a plant.
Approach: Harry started with historical rip saw data on the dimensions of incoming boards and the boards they were ripped into, demand forecasts from sales, mill schedules, and work-in-progress inventory. Leveraging a recently implemented software system that improved data visibility, he developed best practices for rip saw configuration that optimize rip saw yield while meeting forecasted demand, improving overall mill yield and the margin profile of the plant.
Impact: Harry was able to develop a program that successfully reduced waste through its arbor design and scheduling recommendations. By incorporating the additional goal of demand adherence, occasionally at the short-term expense of material yield, the plant margin profile could be improved. High-volume lumber species saw a 0.9% improvement in material yield, and lower-volume species saw improvements in demand adherence. An additional benefit of this project was the time it freed up for the individuals responsible for weekly mill planning.
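The core of a rip saw yield calculation can be sketched as a one-dimensional cutting problem: choose strip widths for one arbor so that the strips, plus the saw kerf consumed by each cut, waste as little of the board as possible. This is a generic unbounded-knapsack sketch with invented dimensions, not AHF’s actual software:

```python
def best_rip(board_width, strip_widths, kerf=0.125):
    """
    Choose strip widths (repeats allowed) whose total width, plus one saw
    kerf per strip, fits within the board while maximizing material used.
    Unbounded-knapsack dynamic program over width in thousandths of an inch.
    """
    scale = 1000  # integer thousandths avoid floating-point DP keys
    w = int(round(board_width * scale))
    k = int(round(kerf * scale))
    strips = [int(round(s * scale)) for s in strip_widths]
    used = [0] * (w + 1)          # used[x]: max usable width within total x
    choice = [None] * (w + 1)     # choice[x]: last strip picked at width x
    for x in range(1, w + 1):
        for s in strips:
            need = s + k          # each strip consumes its width plus a kerf
            if need <= x and used[x - need] + s > used[x]:
                used[x] = used[x - need] + s
                choice[x] = s
    picks, x = [], w              # recover the chosen strip widths
    while choice[x] is not None:
        picks.append(choice[x] / scale)
        x -= choice[x] + k
    return picks, used[w] / scale / board_width  # strips, material yield

# A hypothetical 5.0" board ripped into 2.25" flooring strips: two strips
# fit (with kerf), for a 90% material yield on this board.
picks, material_yield = best_rip(5.0, [2.25, 1.5])
```

Real arbor design layers demand adherence on top of this, sometimes accepting a slightly lower-yield rip pattern because it produces the widths the sales forecast actually needs, which is exactly the trade-off Harry’s program made.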
Developing a Capacity Analysis Tool in a Vertically Integrated, High Mix, Low Volume Engineering Landscape
Hans Nowak II (LGO ’20)
Problem: Raytheon’s Circuit Card Assembly (CCA) factory is the largest Department of Defense CCA manufacturer in the world. Recently, two factors have increased the demand on the CCA factory: large-scale consolidation from three manufacturing factories into a single CCA factory, and a rapidly growing market for new Raytheon technology, resulting in a high volume of new product introductions. With Raytheon’s recent merger with UTC, the ability to continuously analyze factory capacity for new contracts and variable demand is needed.
Approach: For this project, Hans worked on three primary objectives. He first created a sustainable capacity analysis tool that automatically and continuously updated projected capacity utilization. Second, he used machine learning and other techniques to predict cycle times of future demand. Lastly, he provided data-driven recommendations from the capacity analysis to support optimal strategic capacity planning decisions.
Impact: Hans’ strategy combined the capability of automated data mining algorithms, the predictive power of machine learning, and the optimization ability of mathematical programming. The models he built enabled Raytheon to make better decisions when planning factory capacity in the long term, and to get a clearer picture of operational health in the short term. It is important to note that they must be properly implemented and scaled to have a sustainable effect on the company.
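The last step of such a pipeline, turning predicted cycle times and a demand forecast into projected utilization, is simple once the predictions exist. The sketch below is a generic illustration with invented product names and numbers, not Raytheon’s tool; in practice the per-unit cycle hours would come from the machine learning model rather than a hand-written dictionary:

```python
def projected_utilization(demand, predicted_cycle_hours, available_hours):
    """
    Convert a demand forecast (units per product, per period) and predicted
    per-unit cycle times into projected capacity utilization per period.
    """
    utilization = []
    for period, orders in enumerate(demand):
        # Total machine-hours of load implied by this period's orders
        load = sum(qty * predicted_cycle_hours[product]
                   for product, qty in orders.items())
        utilization.append(load / available_hours[period])
    return utilization

# Hypothetical example: one period, two CCA product families, 250 hours
# of available capacity -> (100*1.5 + 40*2.0) / 250 = 92% utilization.
util = projected_utilization(
    demand=[{"radar_cca": 100, "guidance_cca": 40}],
    predicted_cycle_hours={"radar_cca": 1.5, "guidance_cca": 2.0},
    available_hours=[250],
)
```

Keeping this calculation automated and continuously refreshed, rather than rebuilt by hand for each new contract, is what made the tool sustainable.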
A Systems-Based Analysis Method for Safety Design in Rocket Testing Controllers
Jeremy Paquin (LGO ’19)
Problem: NASA’s Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond Earth’s orbit into deep space. Boeing is the prime contractor building the Core Stage, the backbone of the SLS. The primary objective of Jeremy’s project was to improve Boeing’s ability to perform test firing of the current and future iterations of the SLS in a way that minimizes schedule risk and cost.
Approach: Jeremy’s project modeled the current stage controller processes. This included interviewing key stakeholders and subject matter experts and developing a top-down systems model of key interactions. He also defined hazards, accidents, and key interactions, and held stakeholder In-Progress Reviews (IPRs) throughout. Jeremy then completed a Systems-Theoretic Process Analysis (STPA) on the stage controller design, comparing the unsafe control actions and key performance metrics it identified against those from previous analyses.
Impact: The outcome of the project provided a framework for Stage Controller safety analysis for safer testing and pre-launch operations of the SLS Core Stage in a way that minimized cost and schedule risk. This framework is applicable to future space launch architecture development. Additionally, non-technical recommendations were suggested in the areas of testing operations and organization. Future work will examine extensions of STPA to software, and automation of certain steps.
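The automation that Jeremy’s future work points to can be hinted at with a tiny sketch. One mechanical part of STPA is enumerating, for every control action, the four standard ways it can be unsafe (not provided, provided when unsafe, provided with wrong timing, stopped too soon or applied too long); the control action names below are hypothetical, not from the actual stage controller analysis:

```python
def enumerate_uca_prompts(control_actions):
    """Generate the four standard STPA unsafe-control-action prompts per action."""
    modes = [
        "not provided",
        "provided in an unsafe context",
        "provided too early or too late",
        "stopped too soon or applied too long",
    ]
    return [(action, f"Is it hazardous if '{action}' is {mode}?")
            for action in control_actions
            for mode in modes]

# Hypothetical stage controller commands; each yields four review prompts
# for the analyst to assess against the defined hazards.
prompts = enumerate_uca_prompts(["open LOX fill valve", "start engine chill"])
```

Generating these prompts is the easy, automatable part; judging each one against the defined hazards and accidents remains the analyst’s job, which is why Jeremy’s stakeholder reviews were central to the method.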