Accelerating the Shift to Software Defined Visualization

Advances in visualization are essential for managing—and maximizing value from—the rising flood of data, the growing sophistication of simulation codes, the convergence of machine learning (ML) and simulation workloads, and the development of extreme-scale computers.

Software-defined visualization (SDVis) has emerged as one of the most promising and practical ways to deliver these advances. Taking advantage of modern, general-purpose processors, SDVis offers a scalable, highly performant approach to visualizing massive data sets without GPU hardware. By enabling simulation and visualization to run on the same platform hardware, SDVis not only produces high-fidelity visualization but also reduces costs and complexity while increasing flexibility for scientists and system managers alike.

Four Intel Parallel Computing Centers (IPCCs) are working to expand the power of SDVis, driving higher performance, fidelity, usability, and scalability. We caught up with representatives of these centers to hear about some of their latest work, including breakthroughs that will be featured at conferences and in papers in the coming months. IPCC innovations are incorporated into updated versions of SDVis libraries such as OSPRay, Embree, and OpenSWR, and delivered to the HPC community through open-source and third-party software solutions. Many are also wrapped into a set of performance-optimized cluster reference architectures that Intel announced in June.

Collaborating to Accelerate Visualization

IPCCs work to develop relevant curriculum and actively modernize key community codes to run on industry-standard parallel architectures. Reflecting the importance of visualization technologies in driving insights and maximizing data value, four IPCCs focus on visualization algorithms and applications:

  • University of Stuttgart Visualization Research Center (VISUS). A central research institution at the University, VISUS concentrates on scientific visualization processes, visual analytics, visual computing, and computer graphics. VISUS developed and is now modernizing the MegaMol* system, which is used for analyzing molecular dynamics data and other massive particle-based data sets.
  • University of Tennessee Joint Institute for Computational Sciences. Among other projects, the UT IPCC has been using SDVis tools to advance VisIt* and other science applications and to exploit new processor architectures on future exascale systems.
  • University of Texas Advanced Computing Center (TACC). The TACC IPCC is directed by Paul Navrátil and has been designated a Visualization Center of Excellence. It focuses on in-situ visualization, virtual and augmented reality for enhanced data analysis, and machine learning to improve image quality and visualization performance.
  • University of Utah. The Utah IPCC focuses on modernizing scientific visualization and computation on many-core architectures. IPCC activities are led by Valerio Pascucci, John R. Park professor of computer science, director of the university’s Center for Extreme Data Management Analysis and Visualization and a faculty member in the Scientific Computing and Imaging (SCI) Institute, and Chris Johnson, distinguished professor of computer science and SCI Institute faculty member.

IPCCs work independently and in concert, and often collaborate with national labs and other partner organizations. They use local computing resources along with resources such as the Stampede2 supercomputer at TACC.

For example, a project at the University of Utah IPCC is advancing ray tracing of block-structured adaptive mesh refinement (BS-AMR) data in OSPRay. Working with scientists from the NASA Ames Research Center, Stephen Hawking’s COSMOS Research Group, and others, the Utah team developed an algorithm to interactively render crack-free, implicit isosurfaces in combination with direct volume rendering and advanced shading effects such as transparency, ambient occlusion, and path tracing. They say their method, which they have packaged as an OSPRay module, is 8 to 53 times faster than OSPRay’s previous, sample-based isosurface method. Its higher-fidelity rendering helps enable clearer, faster insights into everything from black hole mergers to the performance of spacecraft landing gear. Researchers will present their results this October at the IEEE VIS conference. The team has submitted a paper describing its approach to IEEE Transactions on Visualization and Computer Graphics for 2019 publication, and is preparing to release the OSPRay module to the open source community.
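For readers unfamiliar with how isosurfaces enter the OSPRay rendering model, the sketch below shows the general calling pattern against the OSPRay 1.x C API as we understand it. It is a minimal, illustrative example only: the volume data and isovalue are placeholders, and the Utah team's BS-AMR module (and whatever types or parameters it exposes) is not shown.

    #include <ospray/ospray.h>
    #include <stdlib.h>

    int main(int argc, const char **argv)
    {
      ospInit(&argc, argv);                      /* start the OSPRay CPU device */

      /* Placeholder scalar field: a small structured volume. A real code
         would hand OSPRay its simulation data instead. */
      const int nx = 64, ny = 64, nz = 64;
      float *voxels = (float *)calloc((size_t)nx * ny * nz, sizeof(float));

      OSPData voxelData = ospNewData((size_t)nx * ny * nz, OSP_FLOAT,
                                     voxels, OSP_DATA_SHARED_BUFFER);
      OSPVolume volume = ospNewVolume("shared_structured_volume");
      ospSetData(volume, "voxelData", voxelData);
      ospSetString(volume, "voxelType", "float");
      ospSet3i(volume, "dimensions", nx, ny, nz);
      ospCommit(volume);

      /* Implicit isosurface: extracted during ray traversal, so no
         triangle mesh is ever built or stored. */
      float isovalue = 0.5f;
      OSPGeometry iso = ospNewGeometry("isosurfaces");
      ospSetData(iso, "isovalues", ospNewData(1, OSP_FLOAT, &isovalue, 0));
      ospSetObject(iso, "volume", volume);
      ospCommit(iso);

      OSPModel world = ospNewModel();
      ospAddGeometry(world, iso);
      ospCommit(world);

      /* A renderer such as "scivis" or "pathtracer" would then shade the
         surface (ambient occlusion, transparency, etc.) into a framebuffer;
         that boilerplate is omitted here. */
      free(voxels);
      return 0;
    }

In OSPRay, new renderers, volume types, and geometries typically ship as loadable modules (via ospLoadModule), which is presumably how the team's BS-AMR support will be distributed once the module is released.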

SDVis In-Situ and in the Cloud

IPCCs are also advancing SDVis capabilities for in-situ visualization, including remote and distributed rendering on devices such as tiled wall displays. By using the same platforms for simulation and visualization, in-situ visualization avoids bottlenecks caused by transferring huge data volumes to and from special-purpose graphics hardware. It allows scientists to interact with a simulation as it runs, modifying parameters in real time and immediately seeing the effect on the phenomena they’re investigating. Adjusting parameters on the fly can also reduce computational requirements by short-circuiting unproductive simulation paths.
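As a rough illustration of that pattern (not code from any particular IPCC project), an in-situ steering loop can be reduced to a few lines; every function below is a hypothetical placeholder standing in for a real simulation kernel, a CPU-side SDVis renderer, and a user-facing control channel.

    #include <stdbool.h>
    #include <stdio.h>

    /* Hypothetical hooks: stand-ins for a real simulation code, an SDVis
       renderer, and a steering/control interface. */
    typedef struct { double viscosity; double timestep; } SimParams;

    static void simulation_advance(SimParams *p) { (void)p; /* one timestep */ }
    static void render_current_state(int step)   { printf("rendered frame at step %d\n", step); }
    static bool steering_update(SimParams *p, int step)
    {
      (void)p;              /* in a real loop, apply parameter changes from the scientist */
      return step < 100;    /* this sketch simply stops after 100 steps */
    }

    int main(void)
    {
      SimParams params = { .viscosity = 1.0e-3, .timestep = 0.01 };
      const int render_every = 10;            /* render every Nth step to bound overhead */

      for (int step = 0; ; ++step) {
        simulation_advance(&params);          /* simulation and visualization share the    */
        if (step % render_every == 0)         /* same memory: no bulk transfer to separate */
          render_current_state(step);         /* graphics hardware                         */
        if (!steering_update(&params, step))  /* adjust parameters mid-run, or abort an    */
          break;                              /* unproductive simulation path early        */
      }
      return 0;
    }

In practice the rendering call would hand the simulation's in-memory arrays directly to an SDVis library, which is exactly what avoids the data-transfer bottleneck described above.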

The University of Tennessee’s IPCC has developed a system to deliver high-fidelity volume rendering as a scalable, lightweight web service. Tapestry offers a platform-agnostic method of encapsulating and embedding interactive 3D volume rendering into the web’s standard Document Object Model (DOM). Backed by scalable server-side rendering, Tapestry lets users display and interact with 3D visualizations, rotating, zooming, and performing other functions. They can use these capabilities to collaborate with team members, as well as to engage and educate external stakeholders. Tapestry is available to the open source community at https://github.com/seelabutk/tapestry.

A massive wall display rendered with OSPRay is helping visitors to the City of Science and Industry Museum in Paris explore the properties of fire. The wall rendering is noteworthy for its size (1.5 gigapixels) as well as its complex mix of solid objects such as spheres and streamlines with volume rendering. The picture was developed by the University of Utah IPCC and is on display through January 2019. You can view it at full resolution with the VISUS web viewer by visiting VISUS.org.

Working with graphics experts from Intel, Kitware, and the University of Oregon, the TACC IPCC is leading the development of a visualization benchmark suite they’ll use for studying the performance of OpenSWR, the rasterization engine in the SDVis stack. They’re looking to confirm that OpenSWR makes CPU-based rasterization and rendering viable in cloud environments as well as on HPC platforms.

Integrating with Software

IPCCs are integrating SDVis and in-situ functionality into a number of widely used scientific and technical applications, improving user workflows and taking advantage of larger system memory and other hardware capabilities to increase fidelity, performance, and scalability.

The Tennessee, Utah, and Stuttgart IPCCs have engaged in multi-year projects to enhance OSPRay performance, capabilities, and integration for VisIt on Intel Xeon and Intel Xeon Phi processors. VisIt originated at the Department of Energy and is used for both data analysis and visualization. It’s also extremely popular, downloaded approximately 50,000 times annually. The VisIt optimizations for Intel architecture also integrate improvements to PIDX, a parallel I/O library for multi-resolution scientific data, helping large simulations read and write massive datasets more efficiently for analysis and visualization.

The SDVis enhancements provide strong scaling for rendering, with as much as a 34x improvement to the overall frame rate. They also enable VisIt to perform interactive visualization and offline movie rendering, use cases that were previously problematic because VisIt’s built-in ray caster could not exploit the high degrees of threading available on many-core processors. The SDVis enhancements are being migrated into VisIt with the anticipated fall 2018 release of VisIt 3.0 and the Visualization Toolkit (VTK) 8.1. (Read more: VisIt-OSPRay: Toward an Exascale Volume Visualization System.)

The Stuttgart IPCC is one year into modernizing MegaMol, a cross-platform framework for visualization research. The VISUS team has integrated MegaMol with OSPRay and is enabling it to run in a distributed fashion, either headless or in-situ, on simulation nodes or on separate CPU-based clusters. Focusing on visualization for molecular modeling, the VISUS team is also porting some of MegaMol’s application-specific abstractions for molecular data into OSPRay. With more memory available, scientists can explore larger cellular surfaces and more molecules simultaneously, meeting their need for detailed, biophysics-based, multi-scale models. VISUS is also adding solvent-excluded surface (SES) geometry, which conveys valuable information to domain experts, to OSPRay. Running on Stampede2, MegaMol can interactively visualize a 34-billion-particle dataset. The VISUS team is also focusing on in-situ visualization, interfacing the MegaMol framework with the molecular dynamics software ls1 or the continuum mechanics simulation package opendihu.

Expanded Realities and More

SDVis is also enhancing visualization and interactivity in augmented reality (AR), virtual reality (VR), and mixed-reality (MR) environments—approaches that often dovetail with wall-size display technologies and cloud-based visualization. One TACC project uses motion-capture technology at the TACC Visualization Laboratory to explore how gaze-tracking and motion-tracking might be used to predict areas of interest on a tiled display wall. Integrated into OSPRay or tools such as ParaView, this approach could use human movement to adaptively refine where rendering effort is spent, reducing the amount of rendering needed while allowing for smoother AR and VR experiences. SDVis tools such as Tapestry are also being applied in movie rendering, ultra-high-resolution Powerwall* displays, the mixed-reality technologies of Microsoft HoloLens devices, and other use cases.

OSPRay is also helping researchers at the University of Utah, Lawrence Livermore National Laboratory, and other neuroscience labs begin using VR to examine walk-around models of the human brain. Early prototypes also use OSPRay to render a 360-degree panorama for examination by a seated viewer. (Read more at: A Virtual Reality Visualization Tool for Neuron Tracing.)

Artificial intelligence is also being applied to the challenges of visualization. In one case, a TACC team is conducting foundational research aimed at developing machine learning algorithms to automate the refinement of structured colormaps. Effective colormaps can highlight relevant features in complex visualizations, advancing the analysis of HPC data sets and the communication of complex results. Once integrated into the SDVis stack, ML-enhanced colormaps will also improve the experience and productivity for users of ParaView, VisIt, and other SDVis-enabled software.

Platforms Optimized for Visualization Workloads

Complementing the work of the IPCCs, Intel continues its own activities to expand SDVis. With the new set of performance-optimized cluster reference architectures called Intel Select Solutions for Professional Visualization, the company aims to provide organizations with a fast path for purchasing and deploying clusters optimized for HPC visualization and simulation workloads. Two pre-validated configurations combine Intel hardware and software elements, including SDVis capabilities, with optimized open source technologies. The resulting platforms enable photorealistic images and high performance on large, complex datasets and allow scientists to solve and visualize their models simultaneously. Independent organizations can offer their own configurations as long as they meet or exceed Intel’s minimum performance thresholds and follow Intel’s software and hardware stack requirements.

Paving the Path to Exascale

From gravitational wave observatories to pharmaceutical research houses, visualization is increasingly critical to creating insights and value from massive data sets. As extreme-scale computers, advanced simulation software, and machine learning come together, the opportunities to obtain value from data-intensive simulations will skyrocket. IPCCs play key roles in delivering that value. By expanding the SDVis ecosystem and advancing the effectiveness of software-based visualization on general-purpose processors, the IPCCs are validating and furthering the performance, scalability, fidelity, and usability of CPU-based visualization to handle the largest-scale data sets and the most complex simulation challenges. Along with preconfigured platforms that offer optimized performance and easy deployment for professional visualization, they’re paving the path to a new era of rapid insights and exascale computers.

____________________

Jan Rowell is a freelance writer concentrating on technology trends and impacts in HPC, healthcare, and other industries.
