Pitt | Swanson Engineering

Join Us in Celebrating Our 2020 Graduating Class!

Since its founding in 1893 by two legends, George Westinghouse and Reginald Fessenden, the Department of Electrical and Computer Engineering at Pitt has excelled in education, research, and service. Today, the department features innovative undergraduate and graduate programs and world-class research centers and labs, combining theory with practice at the nexus of computer and electrical engineering so that our students can learn, develop, and lead lives of impact.


Read our latest newsletter below




June 25, 2020

Making a Sustainable Impact Throughout Pitt and Our Communities

All SSoE News, Bioengineering, Chemical & Petroleum, Civil & Environmental, Electrical & Computer, Industrial, MEMS, Student Profiles, Office of Development & Alumni Affairs

"MCSI remains committed to addressing global sustainability issues, connecting our domestic and international pursuits to create synergies locally, nationally, and internationally. We hope you enjoy this summary of the past year’s impacts, and we'd be happy to answer any questions you might have about the report's contents and MCSI's programs."

June 23, 2020

Five Pitt Researchers Receive PA Department of Community and Economic Development Grants

Electrical & Computer, MEMS

PITTSBURGH (June 23, 2020) — Five researchers at the University of Pittsburgh Swanson School of Engineering have received grants from the Pennsylvania Department of Community and Economic Development (DCED) through the Manufacturing PA initiative. The DCED has approved more than $2.8 million in grants to 43 projects that will “spur new technologies and processes in the manufacturing sector,” according to its press release.

“As engineers, we are applied scientists, and our singular goal in performing research is to produce public impact,” said David Vorp, associate dean for research and John A. Swanson Professor of bioengineering. “I am proud that the Commonwealth of Pennsylvania saw the potential of these projects by our Swanson School faculty and their industrial partners to have benefit to their citizens.”

The five researchers to receive funding at the Swanson School are:

Kevin Chen, Paul E. Lego Professor of Electrical and Computer Engineering: $67,991—Femtosecond Laser Manufacturing of 3D Photonics Components in Nonlinear Optical Substrates for Electro-Optic Applications

Markus Chmielus, associate professor of mechanical engineering and materials science: $70,000—Improving 3D Binder Jet Printed Tungsten-Carbide Parts via Strategies to Increase Green Density and Strength

Jung-Kun Lee, professor of mechanical engineering and materials science: $70,000—Smart Crucible: Monitoring Damage of Crucibles by Embedded Electric Resistance Sensor

Albert To, associate professor of mechanical engineering and materials science: $69,450—A Computational Tool for Simulating the Sintering Behavior in Binder Jet Additive Manufacturing

Xiayun Zhao, assistant professor of mechanical engineering and materials science: $70,000—Pushing the Boundaries of Ceramic Additive Manufacturing (CAM) with Visible light initiated Polymerization (ViP)
Maggie Pavlick
June 18, 2020

Researching resilience

Electrical & Computer

Grid and infrastructure resilience are increasingly important, yet resilience remains a relatively new concept for today’s modern grid and its dynamic environment. With natural disasters on the rise and the northern hemisphere heading into what is commonly known as ‘storm season’, Smart Energy International spoke with Dr. Alexis Kwasinski, associate professor in the Department of Electrical and Computer Engineering at the University of Pittsburgh. Kwasinski specializes in grid resilience research in areas prone to natural disasters and extreme weather. Read the full article.
Smart Energy International, Issue 3, 2020
June 10, 2020

Pitt ECE Professor Receives $300K NSF Award to Develop 2D Synapse for Deep Neural Networks

Electrical & Computer

PITTSBURGH (June 10, 2020) — The world runs on data. Self-driving cars, security, healthcare and automated manufacturing are all part of a “big data revolution,” which has created a critical need for a way to more efficiently sift through vast datasets and extract valuable insights. When it comes to the level of efficiency needed for these tasks, however, the human brain is unparalleled.

Taking inspiration from the brain, Feng Xiong, assistant professor of electrical and computer engineering at the University of Pittsburgh’s Swanson School of Engineering, is collaborating with Duke University’s Yiran Chen to develop a two-dimensional synaptic array that will allow computers to do this work with less power and greater speed. Xiong has received a $300,000 award from the National Science Foundation for this project.

“Deep neural networks (DNN) work by training a neural network with massive datasets for applications like pattern recognition, image reconstruction or video and language processing,” said Xiong. “For example, if airport security wanted to create a program that could identify firearms, they would need to input thousands of pictures of different firearms in different situations to teach the program what it should look for. It’s not unlike how we as humans learn to identify different objects.”

To do this, supercomputing systems constantly transfer data back and forth between the computation and memory units, making DNNs computationally intensive and power hungry. Their inefficiency makes it impractical to scale them up to the level of complexity needed for true artificial intelligence (AI). In contrast, the human brain handles computation and memory with a network of neurons and synapses that are closely and densely connected, resulting in the brain’s extremely low power consumption of about 20 W.

“The way our brains learn is gradual. For example, say you’re learning what an apple is. Each time you see the apple, it might be in a different context: on the ground, on a table, in a hand. Your brain learns to recognize that it’s still an apple,” said Xiong. “Each time you see it, the neural connection changes a bit. In computing we want this high-precision synapse to mimic that, so that over time, the connections strengthen. The finer the adjustments we can make, the more powerful the program can be, and the more memory it can have.”

With existing consumer electronic devices, the kind of gradual, slight adjustment needed is difficult to attain because they rely on binary states, essentially limited to on or off, yes or no. The artificial synapse will instead allow 1,000 distinct states, with precision and control in moving between them.

Additionally, smaller devices, like sensors and other embedded systems, need to communicate their data to a larger computer for processing. The proposed device’s small size, flexibility and low power usage could allow those calculations to run in much smaller devices, letting sensors process information on-site.

“What we’re proposing is that, theoretically, we could lower the energy needed to run these algorithms, hopefully by 1,000 times or more. This way, it can make the power requirement more reasonable, so a flexible or wearable electronic device could run it with a very small power supply,” said Xiong.

The project, titled “Collaborative Research: Two-dimensional Synaptic Array for Advanced Hardware Acceleration of Deep Neural Networks,” is expected to last three years, beginning on Sept. 1, 2020.
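To make the contrast between a binary device and a 1,000-state synapse concrete, here is a minimal illustrative sketch, not code from the project: it snaps a random weight matrix onto a fixed number of evenly spaced device levels and reports how much precision is lost. The random weights and the uniform quantizer are assumptions chosen for illustration; only the two-state and 1,000-state figures come from the article.

```python
import numpy as np

def quantize(weights, n_states):
    """Snap continuous weights onto n_states evenly spaced device levels."""
    w_min, w_max = weights.min(), weights.max()
    step = (w_max - w_min) / (n_states - 1)
    return w_min + np.round((weights - w_min) / step) * step

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(256, 256))   # stand-in weight matrix (assumption)

for n in (2, 1000):                         # binary device vs. 1,000-state synapse
    err = np.abs(quantize(w, n) - w).mean()
    print(f"{n:>4} states -> mean quantization error {err:.6f}")
```

With 1,000 evenly spaced levels the average rounding error is roughly three orders of magnitude smaller than with two, which is the kind of fine-grained adjustment Xiong describes.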
Maggie Pavlick
June 8, 2020

Shedding a New Light on 2D Materials

Electrical & Computer

PITTSBURGH (June 8, 2020) … In the information age, where we ditch paper files and cabinets for digital files and hard drives, there is an imminent need for affordable and efficient ways to store our information. At the beginning of 2020, the digital universe was estimated to consist of 44 zettabytes of data, or 44 trillion gigabytes of information. Every time someone “googles” a question, uploads a photo to social media, or performs a variety of daily activities, that number increases.

The University of Pittsburgh’s Nathan Youngblood and Feng Xiong secured a $501,953 award from the National Science Foundation to better understand how to store data more efficiently using optical and electrical techniques on two-dimensional (2D) materials.

Optical storage, commonly used in rewritable CDs and DVDs, uses a laser to store and retrieve data in what is called a “phase-change material.” Heating these materials causes them to switch between two stable states, in which the atoms are either randomly positioned, as in glass, or ordered, as in a crystal. However, the amount of energy required to heat these materials is fundamentally limited by their volume.

“Encoding data in 2D materials, which are atomically flat, provides a direct route to overcome this fundamental limitation,” said Youngblood, assistant professor of electrical and computer engineering at Pitt’s Swanson School of Engineering and lead researcher on the study. “If you reduce the dimensions of the material, it becomes much more efficient because the amount of energy needed to program data is proportional to the area rather than the volume of the material,” he continued.

Modern 2D materials were first studied in the early 2000s. Their crystalline structure, consisting of a single layer of atoms, demonstrated a variety of useful properties, inspiring research into hundreds of other 2D materials, including MoTe2, the compound used in this project.

“MoTe2 is useful for this application because it is predicted to be the most energy efficient, but due to a lack of experimental data, the way that light affects this material is elusive,” said Youngblood.

Youngblood and Xiong will work in collaboration with Steven Koester, professor of electrical and computer engineering at the University of Minnesota and an expert in 2D materials, to examine how MoTe2 interacts with the light used in optical storage.

“Our goal is to use time-dependent light-based measurement techniques along with advanced imaging and characterization at the atomic level,” said Xiong, assistant professor of electrical and computer engineering at Pitt’s Swanson School of Engineering.

Storing and retrieving data in atomically flat materials like MoTe2 could enable highly efficient processors for machine learning in which the computation physically occurs in the memory cell itself. This approach, known as “in-memory” computing, has been demonstrated to be much faster than conventional digital computing, though up to now it has relied on three-dimensional materials like those used in rewritable optical discs.

“A better understanding of MoTe2 properties will allow us to advance this technology and improve the use of 2D materials for high-speed, reliable and efficient memory and computation,” said Youngblood.
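The area-versus-volume argument can be made concrete with a short back-of-the-envelope sketch. Every number below, including the cell footprint, heat capacity, temperature rise, and film thicknesses, is an illustrative assumption rather than a value from the study; the point is only that the heat required to reach the switching temperature grows with the heated volume, so an atomically thin layer of the same footprint switches far more cheaply.

```python
# Back-of-the-envelope comparison: programming energy scales with the heated volume.
# Every number here is an illustrative assumption, not a value from the study.
AREA = (100e-9) ** 2      # assumed 100 nm x 100 nm memory-cell footprint, in m^2
HEAT_CAPACITY = 1.5e6     # assumed volumetric heat capacity, in J/(m^3*K)
DELTA_T = 600.0           # assumed temperature rise needed to switch the material, in K

def switching_energy(thickness_m):
    """Energy to heat the cell volume by DELTA_T (ignores losses and latent heat)."""
    return HEAT_CAPACITY * AREA * thickness_m * DELTA_T

bulk = switching_energy(50e-9)    # a conventional ~50 nm phase-change film (assumed)
mono = switching_energy(0.7e-9)   # roughly one atomic layer of MoTe2 (assumed)
print(f"bulk film : {bulk:.2e} J per switch")
print(f"monolayer : {mono:.2e} J per switch ({bulk / mono:.0f}x less)")
```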
