Some LGO students dive deep into scientific research topics at their internship company. These projects apply cutting-edge technologies and engineering research practices to solve unique problems or create new discoveries, then find ways to use those discoveries in a business context. They often result in a new product or design.
Kerry Weinberg (LGO ’16)
Company: Amgen
Location: Boston, MA
Problem: The Chinese hamster ovary (CHO) system is used to manufacture therapeutic proteins, which have a wide variety of medical uses. Insulin was the first therapeutic protein used in a systematic way. Scientists need a robust biological understanding to optimize the CHO bioprocess. To build that understanding, researchers can analyze and synthesize biological data on the transcriptome (gene expression), proteome (protein levels), or metabolome (metabolite levels). Kerry focused on streamlining and standardizing transcriptomic data analysis and used the resulting workflow to mine Amgen’s historical datasets.
Approach: The streamlined workflow reduced both the barrier to entry and the cycle time for data mining. Scientists and engineers at Amgen can also use it to analyze gene expression data efficiently.
By mining historical Amgen datasets using the streamlined workflow, Kerry identified gene expression signatures indicative of hyper-productivity. Specifically, she found key biological pathways specific to a highly productive Amgen cell line. Interestingly, these pathways had not previously been identified as hyper-productivity traits. This work suggests that these pathways are critical to heightened levels of protein production.
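As an illustration of the kind of analysis a streamlined transcriptomic workflow enables, the sketch below flags genes whose expression differs sharply between a reference cell line and a high producer, using log2 fold change. The gene names, expression values, and threshold are hypothetical, not Amgen data.

```python
import math

# Hypothetical mean expression values (arbitrary units) for a reference
# cell line and a high-producer cell line; gene names are illustrative.
reference = {"geneA": 120.0, "geneB": 80.0, "geneC": 200.0, "geneD": 55.0}
high_producer = {"geneA": 125.0, "geneB": 640.0, "geneC": 210.0, "geneD": 220.0}

def fold_change_signature(ref, test, threshold=1.0):
    """Return genes whose absolute log2 fold change meets the threshold."""
    signature = {}
    for gene, ref_val in ref.items():
        lfc = math.log2(test[gene] / ref_val)
        if abs(lfc) >= threshold:
            signature[gene] = round(lfc, 2)
    return signature

print(fold_change_signature(reference, high_producer))
# geneB (8x up) and geneD (4x up) stand out as candidate signature genes.
```

In practice, a workflow like Kerry’s would add normalization, replicate statistics, and pathway-level aggregation on top of a per-gene comparison like this one.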
Impact: Using this information to engineer future cell lines could enable Amgen to improve cellular protein production by over 30%, dramatically impacting costs associated with drug substance manufacturing. More broadly, this example of streamlining and standardizing transcriptomic data provides a framework for how Amgen Process Development can leverage biological data to improve CHO systems understanding and achieve operational impacts.
Andrew Byron (LGO ’16)
Company: Raytheon Missile Systems
Location: Tucson, AZ
Problem: Additive manufacturing (AM) is a new way to digitally manufacture complicated structures. Its strengths include shorter lead times, reduced material waste, and a simpler overall process. However, the industry lacks common procedures and standards, which limits collaborative AM development and makes useful information difficult to find. Raytheon Missile Systems recognized that it could accelerate the development of advanced missile systems by leveraging the strengths of AM, and asked Andrew to find a way to do that.
Approach: To qualify AM for use on Raytheon’s flight-critical parts, Andrew focused on three specific objectives:
- Create a standard process that builds on Raytheon’s existing product development knowledge and allows product teams across the company to quickly produce qualification plans for their chosen material and application.
- Identify key process controls and their corresponding experimental responses in the metals AM technology currently funded by Raytheon. Because public information is limited, first-hand experimentation was needed to confirm the theory.
- Use statistical methods for experimental design. Andrew applied analysis of variance (ANOVA) and regression model fitting, providing a template for future experiments at Raytheon.
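The variance analysis in Andrew’s template can be sketched as a one-way ANOVA computed by hand. The factor (laser power), the response (tensile strength), and every value below are illustrative assumptions, not Raytheon data.

```python
# Hypothetical tensile-strength measurements (MPa) for three laser-power
# settings in a metals AM build; values are illustrative only.
groups = {
    "low":    [930.0, 940.0, 935.0],
    "medium": [960.0, 965.0, 955.0],
    "high":   [945.0, 950.0, 940.0],
}

def one_way_anova_f(data):
    """Compute the one-way ANOVA F statistic from scratch."""
    all_vals = [v for g in data.values() for v in g]
    grand_mean = sum(all_vals) / len(all_vals)
    # Between-group sum of squares: how far group means sit from the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in data.values())
    # Within-group sum of squares: scatter of observations around their group mean
    ss_within = sum((v - sum(g) / len(g)) ** 2
                    for g in data.values() for v in g)
    df_between = len(data) - 1
    df_within = len(all_vals) - len(data)
    return (ss_between / df_between) / (ss_within / df_within)

f_stat = one_way_anova_f(groups)
print(f"F = {f_stat:.2f}")  # a large F suggests laser power affects strength
```

A DOE template like Andrew’s would pair an F statistic like this with a significance threshold and a regression model relating the process controls to the responses.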
Impact: The experiment successfully established a procedure for qualification tests. Initial results are promising for continued use of Andrew’s methods. Andrew created a qualification test plan and process to help Raytheon integrate metals AM technology into new programs. The process will continue to be refined and is designed to be flexible for multiple materials and manufacturing techniques.
Mario Orozco (LGO ’16)
Company: Dell Inc.
Location: Austin, TX
Problem: At Dell, many departments are eager to apply artificial intelligence to their operations. However, can “intelligence” be applied at Dell, and if so, where and how? Dell asked Mario to explore AI technologies and identify those that are a fit with Dell (for business optimization and internal applications).
Approach: Mario brainstormed over thirty potential AI applications at Dell, focusing on security, serviceability, manageability, and productivity. To prioritize the concepts, he applied a framework that weighed technology readiness, financial opportunity, IP potential, fit with Dell, and reputation. He developed detailed business and technology cases for the top four concepts and selected self-management and healing (computer hardware failure prevention) for further analysis. Today, computer failures are handled after the fact, which is costly for both users and service providers. Dell’s Data Vault software captures hundreds of variables from the hardware components, alert types, and failure types on a desktop or laptop. Applying a data science approach, Mario created a model that predicts hardware failures so the machine can warn the user or self-correct.
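A minimal sketch of how telemetry can drive a warn-or-self-correct decision is shown below. The field names, weights, and threshold are assumptions for illustration; Mario’s actual model and the Dell Data Vault schema are not described in this text, and a production model would be trained on labeled historical failure data rather than hand-tuned rules.

```python
def failure_risk(telemetry):
    """Score hardware-failure risk (0 to 1) from a few telemetry signals.

    Signals and weights are illustrative stand-ins for the hundreds of
    variables a tool like Dell Data Vault collects.
    """
    score = 0.0
    if telemetry["reallocated_sectors"] > 0:   # disk showing early wear
        score += 0.5
    if telemetry["max_temp_c"] > 70:           # sustained high temperature
        score += 0.3
    score += min(telemetry["thermal_alerts"] * 0.1, 0.2)
    return min(score, 1.0)

def recommend_action(telemetry, warn_at=0.4):
    """Warn the user before a predicted failure instead of reacting after it."""
    return "warn_user" if failure_risk(telemetry) >= warn_at else "ok"

sample = {"reallocated_sectors": 3, "max_temp_c": 74, "thermal_alerts": 1}
print(recommend_action(sample))  # prints "warn_user"
```

The design choice this illustrates is the shift the project recommended: moving from post-event repair to a predictive score that triggers action before the failure occurs.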
Impact: The project identified potential direct impact in the millions of dollars, and two solutions were recommended for immediate pursuit because of their higher impact and technology readiness. Machine learning analysis can help prevent computer hardware failure, and machine learning technologies present significant opportunities in cost reduction and avoidance, customer satisfaction, and new demand for products.