Google Chips Away at Problems at “Mega-Batch” Scale

August 9, 2021 Nicole Hemsoth Prickett

As Google’s batch sizes for AI training continue to skyrocket, with some ranging from over 100,000 to one million, the company’s research arm is looking at ways to improve everything from efficiency and scalability to privacy for those whose data is used in large-scale training runs. …
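For context, a common heuristic in the large-batch training literature is the linear learning-rate scaling rule (Goyal et al., 2017): when the batch size grows by a factor of k, the learning rate is scaled by k as well. The sketch below is a general illustration of that rule, not the specific techniques Google’s researchers describe; the base values are hypothetical.

```python
# A minimal sketch of the linear learning-rate scaling rule often used
# in large-batch training. Illustrative only; the base values below are
# hypothetical, not drawn from Google's work.

BASE_LR = 0.1        # assumed learning rate tuned at the base batch size
BASE_BATCH = 256     # assumed batch size at which BASE_LR was tuned

def scaled_lr(batch_size: int, base_lr: float = BASE_LR,
              base_batch: int = BASE_BATCH) -> float:
    """Scale the learning rate linearly with the batch size."""
    return base_lr * (batch_size / base_batch)

# At "mega-batch" scale, naive linear scaling produces enormous steps,
# which is one reason training at these sizes needs new techniques:
for batch in (256, 8_192, 131_072, 1_000_000):
    print(f"batch={batch:>9,}  lr={scaled_lr(batch):,.2f}")
```

The numbers make the problem concrete: at a batch size of one million, the naively scaled learning rate is roughly 4,000 times the base rate, which destabilizes training and is part of why layer-wise schemes such as LARS and LAMB were developed for this regime.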
