
opensource.google.com

Showing posts with label events.

OpenXLA Dev Lab 2024: Building Groundbreaking ML Systems Together

Thursday, May 9, 2024


AMD, Arm, AWS, Google, NVIDIA, Intel, Tesla, SambaNova, and more come together to crack the code for colossal AI workloads

As AI models grow increasingly complex and compute-intensive, the need for efficient, scalable, and hardware-agnostic infrastructure has never been greater. OpenXLA is a deep learning compiler framework that makes it easy to speed up and massively scale AI models on a wide range of hardware types—from GPUs and CPUs to specialized chips like Google TPUs and AWS Trainium. It is compatible with popular modeling frameworks—JAX, PyTorch, and TensorFlow—and delivers leading performance. OpenXLA is the acceleration infrastructure of choice for global-scale AI-powered products like Amazon.com Search, Google Gemini, Waymo self-driving vehicles, and x.AI's Grok.
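As a rough illustration of that framework compatibility, here is a minimal, hedged JAX sketch of handing a function to XLA for compilation; the function and shapes are invented for the example:

```python
import jax
import jax.numpy as jnp

# jax.jit hands this function to XLA, which compiles it once into an
# optimized executable for whatever backend is present (CPU, GPU, or TPU).
@jax.jit
def predict(w, b, x):
    return jnp.tanh(x @ w + b)

w = jnp.ones((4, 4))
b = jnp.zeros((4,))
x = jnp.ones((2, 4))
print(predict(w, b, x))  # first call triggers XLA compilation; later calls reuse it
```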


The OpenXLA Dev Lab

On April 25th, the OpenXLA Dev Lab played host to over 100 expert ML practitioners from 10 countries, representing industry leaders like AMD, Arm, AWS, ByteDance, Cerebras, Cruise, Google, NVIDIA, Intel, Tesla, SambaNova, and more. The full-day event, tailored to AI hardware vendors and infrastructure engineers, broke the mold of previous OpenXLA Summits by focusing purely on “Lab Sessions”, akin to office hours for developers, and hands-on Tutorials. The energy of the event was palpable as developers worked side-by-side, learning and collaborating on both practical challenges and exciting possibilities for AI infrastructure.

World map showing where developers come from across countries to the OpenXLA Dev Lab
Figure 1: Developers from around the world congregated at the OpenXLA Dev Lab.

The Dev Lab was all about three key things:

  • Educate and Empower: Teach developers how to implement OpenXLA's essential workflows and advanced features through hands-on tutorials.
  • Offer Expert Guidance: Provide personalized office hours led by OpenXLA experts to help developers refine their ideas and contributions.
  • Foster Community: Encourage collaboration, knowledge-sharing, and lasting connections among the brilliant minds in the OpenXLA community.

Tutorials

The Tutorials included:

Integrating an AI Compiler & Runtime into PJRT

  • Learn how PJRT connects ML frameworks to AI accelerators, standardizing their interaction for easy model deployment on diverse hardware.
  • Explore the PJRT C API for framework-hardware communication.
  • Implement a PJRT Plugin, a Python package that implements the C API.
  • Discover plugin examples for Apple Metal, CUDA, Intel GPU, and TPU.

Led by Jieying Luo and Skye Wanderman-Milne
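To make the device-discovery side of this concrete, here is a minimal, hedged sketch (using JAX, one of the PJRT-based frameworks mentioned above) of checking which PJRT backend and devices the framework has picked up; the backend names in the comments are examples only:

```python
import jax

# JAX talks to hardware through a PJRT client; whichever plugin is
# installed (CPU, CUDA, TPU, Metal, ...) is what shows up here.
print(jax.default_backend())          # e.g. "cpu", "gpu", or "tpu"
for device in jax.devices():
    print(device.id, device.platform, device.device_kind)
```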


Extracting StableHLO Graphs + Intro to StableHLO Quantizer

  • Learn to export StableHLO from JAX, PyTorch, and TensorFlow using static/dynamic shapes and SavedModel format.
  • Hack along with the tutorial using the JAX, PyTorch, and TensorFlow Colab notebooks provided on OpenXLA.org.
  • Simplify quantization with StableHLO Quantizer, a framework- and device-agnostic tool.
  • Explore streamlined parameter selection and model rewriting for lower precision.

Led by Kevin Gleason, Jen Ha, and Xing Liu
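To make the JAX export path concrete, here is a minimal, hedged sketch of extracting the lowered (StableHLO-based) module for an arbitrary function; it assumes a reasonably recent JAX version, and the Colab notebooks on OpenXLA.org cover the PyTorch and TensorFlow paths:

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * 2.0

# Lower the jitted function for a concrete input shape and print the
# resulting module as text.
lowered = jax.jit(f).lower(jnp.ones((8,), dtype=jnp.float32))
print(lowered.as_text())
```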


Optimizing PyTorch/XLA Auto-sharding for Your Hardware

  • Discover this experimental feature that automates distributing large-scale PyTorch models across XLA devices.
  • Learn how it partitions and distributes for out-of-the-box performance without manual intervention.
  • Explore future directions such as customizable cost models for different hardware.

Led by Yeounoh Chung and Pratik Fegade
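For context on what is being automated, the manual SPMD annotation that auto-sharding aims to make unnecessary looks roughly like the sketch below. This is a hedged sketch based on the PyTorch/XLA SPMD user API; the tensor shape and mesh axis names are placeholders:

```python
import numpy as np
import torch
import torch_xla.core.xla_model as xm
import torch_xla.runtime as xr
import torch_xla.distributed.spmd as xs

xr.use_spmd()  # switch the runtime into SPMD (GSPMD) execution mode

# Arrange all addressable XLA devices into a 1-D 'data' mesh.
num_devices = xr.global_runtime_device_count()
mesh = xs.Mesh(np.arange(num_devices), (num_devices,), ('data',))

t = torch.randn(16, 128).to(xm.xla_device())
# Manual annotation: shard dim 0 across the 'data' axis, replicate dim 1.
# Auto-sharding is meant to infer annotations like this for you.
xs.mark_sharding(t, mesh, ('data', None))
```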


Optimizing Compute and Communication Scheduling with XLA

  • Scale ML models across multiple GPUs with SPMD partitioning, collective communication, and HLO optimizations.
  • Explore tensor parallelism, the latency-hiding scheduler, and pipeline parallelism.
  • Learn collective optimizations and pipeline parallelism for efficient large-scale training (a small collective-communication sketch follows below).

Led by Frederik Gossen, TJ Xu, and Abhinav Goel
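As flagged in the list above, here is a small, hedged JAX sketch of the compute-plus-collective pattern these optimizations target; the actual partitioning, scheduling, and overlap decisions are made inside XLA rather than in user code:

```python
import jax
import jax.numpy as jnp

def step(x):
    local = jnp.sum(x ** 2)          # per-device computation
    return jax.lax.psum(local, 'i')  # cross-device all-reduce collective

# One shard of data per local device; XLA's SPMD partitioner and
# latency-hiding scheduler decide how to overlap the collective with compute.
n = jax.local_device_count()
shards = jnp.ones((n, 4))
print(jax.pmap(step, axis_name='i')(shards))
```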


Lab Sessions

Lab Sessions featured use case-specific office hours for AMD, Arm, AWS, ByteDance, Intel, NVIDIA, SambaNova, Tesla, and more. OpenXLA engineers were on hand to provide development teams with dedicated support and to walk through specific pain points and designs. In addition, Informational Roundtables covering broader topics like GPU ML Performance Optimization, JAX, and PyTorch-XLA GPU were available for those without specific use cases. This approach led to productive exchanges and fine-grained exploration of critical contribution areas for ML hardware vendors.

four photos of participants and vendors at OpenXLA Dev Lab

Don’t just take our word for it – here’s some of the feedback we received from developers:

"OpenXLA is awesome, and it's great to see the community interest around it. We're excited about the potential of PJRT and StableHLO to improve the portability of ML workloads onto novel hardware such as ours. We appreciate the support that we have been getting." 
      — Mark Gottscho, Senior Manager and Technical Lead at SambaNova
"Today I learned a lot about Shardonnay and about some of the bugs I found in the GSPMD partitioner, and I got to learn a lot of cool stuff." 
      — Patrick Toulme, Machine Learning Engineer at AWS
“I learned a lot, a lot about how XLA is making tremendous progress in building their community.” 
      — Tejash Shah, Product Manager at NVIDIA
“Loved the format this year - please continue … lots of learning, lots of interactive sessions. It was great!” 
      — Om Thakkar, AI Software Engineer at Intel

Technical Innovations and The Bold Road Ahead

The event kicked off with a keynote by Robert Hundt, Distinguished Engineer at Google, who outlined OpenXLA's ambitious plans for 2024, particularly three major areas of focus:

  • Large-scale training
  • GPU and PyTorch compute performance
  • Modularity and extensibility

Empowering Large-Scale Training

OpenXLA is introducing powerful features to enable model training at record-breaking scales. One of the most notable additions is Shardonnay, a tool coming soon to OpenXLA that automates and optimizes how large AI workloads are divided across multiple processing units, ensuring efficient use of resources and faster time to solution. Building on the success of its predecessor, SPMD, Shardonnay empowers developers with even more fine-grained control over partitioning decisions, all while maintaining the productivity benefits that SPMD is known for.

Diagram of sharding representation with a simple rank 2 tensor and 4 devices.
Figure 2: Sharding representation example with a simple rank 2 tensor and 4 devices.
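Shardonnay itself is not yet released, so as a point of reference, here is a hedged sketch of how a sharding like the one in Figure 2 is expressed with JAX's current GSPMD-based APIs (it assumes a runtime with at least 4 addressable devices):

```python
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# A 2x2 device mesh and a rank-2 tensor sharded over both mesh axes.
mesh = Mesh(np.array(jax.devices()[:4]).reshape(2, 2), axis_names=('x', 'y'))
sharding = NamedSharding(mesh, P('x', 'y'))

a = jax.device_put(jnp.arange(64.0).reshape(8, 8), sharding)
jax.debug.visualize_array_sharding(a)  # shows which device owns which block
```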

In addition to Shardonnay, developers can expect a suite of features designed to optimize computation and communication overlap, including:

  • Automatic profile-guided latency estimation
  • Collective pipelining
  • Heuristics-based collective combiners

These innovations will enable developers to push the boundaries of large-scale training and achieve unprecedented performance and efficiency.


OpenXLA Delivers on TorchBench Performance

OpenXLA has also made significant strides in enhancing performance, particularly on GPUs with key PyTorch-based generative AI models. PyTorch-XLA GPU is now neck and neck with TorchInductor for TorchBench Full Graph Models and has a TorchBench pass rate within 5% of TorchInductor.

A bar graph showing a performance comparison of TorchInductor vs. PyTorch-XLA GPU on Google Cloud NVIDIA H100 GPUs
Figure 3: Performance comparison of TorchInductor vs. PyTorch-XLA GPU on Google Cloud NVIDIA H100 GPUs. “Full graph models” represent all TorchBench models that can be fully represented by StableHLO.

Behind these impressive gains lies XLA GPU's global cost model, a game-changer for developers. In essence, this cost model acts as a sophisticated decision-making system, intelligently determining how to best optimize computations for specific hardware. The cost model delivers state-of-the-art performance through a priority-based queue for fusion decisions and is highly extensible, allowing third-party developers to seamlessly integrate their backend infrastructure for both general-purpose and specialized accelerators. The cost model's adaptability ensures that computation optimizations are tailored to specific accelerator architectures, while less suitable computations can be offloaded to the host or other accelerators.
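Purely to illustrate the idea of a priority-based queue for fusion decisions, here is a toy sketch; the candidate fusions and cost numbers are invented, and this is in no way XLA's actual implementation:

```python
import heapq

# Invented candidates: (estimated benefit in microseconds, description).
candidates = [
    (120.0, "fuse matmul + bias_add"),
    (45.0, "fuse bias_add + relu"),
    (5.0, "fuse transpose + copy"),
]

# Highest-estimated-benefit fusion first: push negated benefits onto a min-heap.
heap = [(-benefit, name) for benefit, name in candidates]
heapq.heapify(heap)
while heap:
    neg_benefit, name = heapq.heappop(heap)
    print(f"apply: {name} (estimated saving {-neg_benefit:.0f} us)")
```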

OpenXLA is also breaking new ground with novel kernel programming languages, Pallas and Mosaic, which empower developers to write highly optimized code for specialized hardware. Mosaic demonstrates remarkable efficiency in programming key AI accelerators, surpassing widely used libraries in GPU code generation efficiency for models with 64, 128, and 256 Q head sizes, as evidenced by its enhanced utilization of TensorCores.
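For a flavor of what a Pallas kernel looks like, here is a minimal element-wise example; it is a toy sketch, and real Mosaic/Pallas kernels such as flash attention add tiling via grids and BlockSpecs that are omitted here:

```python
import jax
import jax.numpy as jnp
from jax.experimental import pallas as pl

def add_kernel(x_ref, y_ref, o_ref):
    # Refs point at on-chip buffers; this body is compiled to a hardware kernel.
    o_ref[...] = x_ref[...] + y_ref[...]

@jax.jit
def add(x, y):
    return pl.pallas_call(
        add_kernel,
        out_shape=jax.ShapeDtypeStruct(x.shape, x.dtype),
        interpret=True,  # lets the sketch run on CPU; drop on GPU/TPU
    )(x, y)

x = jnp.arange(8.0)
y = jnp.ones(8)
print(add(x, y))
```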

A bar graph showing a performance comparison of Flash Attention vs. Mosaic GPU on NVIDIA H100 GPUs
Figure 4: Performance comparison of Flash Attention vs. Mosaic GPU on NVIDIA H100 GPUs.

Modular and Extensible AI Development

In addition to performance enhancements, OpenXLA is committed to making the entire stack more modular and extensible. Several initiatives planned for 2024 include:

  • Strengthening module interface contracts
  • Enhancing code sharing between platforms
  • Enabling a shared high-level compiler flow through runtime configuration and component registries

A flow diagram showing modules and subcomponents of the OpenXLA stack.
Figure 5: Modules and subcomponents of the OpenXLA stack.

These improvements will make it easier for developers to build upon and extend OpenXLA.

Alibaba's success with PyTorch XLA FSDP within their TorchAcc framework is a prime example of the benefits of OpenXLA's modularity and extensibility. By leveraging these features, Alibaba achieved state-of-the-art performance for the LLaMa 2 13B model, surpassing the previous benchmark set by Megatron. This demonstrates the power of the developer community in extending OpenXLA to push the boundaries of AI development.

A bar graph showing a performance comparison of TorchAcc and Megatron for LLaMa 2 13B at different numbers of GPUs.
Figure 6: Performance comparison of TorchAcc and Megatron for LLaMa 2 13B at different numbers of GPUs.

Join the OpenXLA Community

If you missed the Dev Lab, don't worry! You can still access StableHLO walkthroughs on openxla.org, as well as the GitHub Gist for the PJRT session. Additionally, the recorded keynote and tutorials are available on our YouTube channel. Explore these resources and join our global community – whether you're an AI systems expert, model developer, student, or just starting out, there's a place for you in our innovative ecosystem.

four photos of participants and vendors at OpenXLA Dev Lab

Acknowledgements

Adam Paszke, Allen Hutchison, Amin Vahdat, Andrew Leaver, Andy Davis, Artem Belevich, Abhinav Goel, Bart Chrzaszcz, Benjamin Kramer, Berkin Ilbeyi, Bill Jia, Cyril Bortolato, David Dunleavy, Eugene Zhulenev, Florian Reichl, Frederik Gossen, George Karpenkov, Gunhyun Park, Han Qi, Jack Cao, Jacques Pienaar, Jaesung Chung, Jen Ha, Jianting Cao, Jieying Luo, Jiewen Tan, Jini Khetan, Kevin Gleason, Kyle Lucke, Kuy Mainwaring, Lauren Clemens, Manfei Bai, Marisa Miranda, Michael Levesque-Dion, Milad Mohammadi, Nisha Miriam Johnson, Penporn Koanantakool, Puneith Kaul, Robert Hundt, Sandeep Dasgupta, Sayce Falk, Shauheen Zahirazami, Skye Wanderman-Milne, Yeounoh Chung, Pratik Fegade, Peter Hawkins, Vaibhav Singh, Tamás Danyluk, Thomas Joerg, TJ Xu, and Tom Natan

By James Rubin – Founding Product Manager, Aditi Joshi – OpenXLA Program Manager, and Elliot English – OpenXLA Technical Lead, on behalf of the OpenXLA Project

Get ready for Google I/O: Program lineup revealed

Wednesday, May 1, 2024


Developers, get ready! Google I/O is just around the corner, kicking off live from Mountain View with the Google keynote on Tuesday, May 14 at 10 am PT, followed by the Developer keynote at 1:30 pm PT.

But the learning doesn’t stop there. Mark your calendars for May 16 at 8 am PT when we’ll be releasing over 150 technical deep dives, demos, codelabs, and more on-demand. If you register online, you can start building your 'My I/O' agenda today.

Here's a sneak peek at some of the exciting highlights from the I/O program preview:

Unlocking the power of AI: The Gemini era unlocks a new frontier for developers. We'll showcase the newest features in the Gemini API, Google AI Studio, and Gemma. Discover cutting-edge pre-trained models from Kaggle, and delve into Google's open-source libraries like Keras and JAX.

Android: A developer's playground: Get the latest updates on everything Android! We'll cover groundbreaking advancements in generative AI, the highly anticipated Android 15, innovative form factors, and the latest tools and libraries in the Jetpack and Compose ecosystem. Plus, discover how to optimize performance and streamline your development workflow.

Building beautiful and functional web experiences: We’ll cover Baseline updates, a revolutionary tool that empowers developers with a clear understanding of web features and API interoperability. With Baseline, you'll have access to real-time information on popular developer resource sites like MDN, Can I Use, and web.dev.

The future of ChromeOS: Get a glimpse into the exciting future of ChromeOS. We'll discuss the developer-centric investments we're making in distribution, app capabilities, and operating system integrations. Discover how our partners are shaping the future of Chromebooks and delivering world-class user experiences.

This is just a taste of what's in store at Google I/O. Stay tuned for more updates, and get ready to be a part of the future.

Don't forget to mark your calendars and register for Google I/O today!

Posted by Timothy Jordan – Director, Developer Relations and Open Source

A look back at BazelCon '23 and the launch of Bazel 7

Tuesday, December 12, 2023

In October ‘23, the Google Bazel team hosted the 7th annual BazelCon, a gathering for the Bazel community and broader Build ecosystem. We welcomed enterprise users and program partners, companies building businesses on top of Bazel, as well as enthusiasts curious to learn more about this space. This year, BazelCon made its debut outside North America and was hosted in the Google Munich office.


BazelCon recap

The Bazel ecosystem is growing. This year, we had over 200 in-person external attendees, over 3K livestream views, and a record number of 120 proposals submitted by the community.

We started the conference with a keynote address by Mícheál Ó Foghlú (Engineering Director at Google), followed by a state-of-the-union address by John Field and Tobias Werth (Engineering Managers at Google).

The Bazel community showcased a series of technical and lightning main-stage talks. To highlight a few:

    • BMW shared insights into how they released several “Bazel cars”
    • JetBrains* announced the preview release of their new Bazel plugin for their IDEs
    • Booking.com walked through their journey of adopting Bazel, thereby reducing CI time from 22 minutes to under 2 minutes and container image size by 80%

Take a look at published recordings of all of these talks at your own leisure.

In addition to hearing from presenters, conference attendees also had the opportunity to engage with each other in smaller, more interactive forums. Through live Q&A with the Bazel team and several Birds of a Feather sessions on topics ranging from authoring rulesets, to collecting usage data responsibly, to IDE integrations, the Bazel community was able to provide direct feedback to the team and spark productive discussions. Make sure to check out published notes from these sessions.

At BazelCon, we also proudly announced the initial release candidate for Bazel 7, which has since launched.


What’s new in Bazel 7?

Bazel 7 is the latest major release on the long-term support (LTS) track. Many multi-year efforts have landed in this release. For example:

Bzlmod: Bzlmod, Bazel's new modular external dependency management system, is now enabled by default (i.e. --enable_bzlmod defaults to true). If your project doesn't have a MODULE.bazel file, Bazel will create an empty one for you. The old WORKSPACE mechanism will continue to work alongside the new Bzlmod-managed system. Learn more about what’s changed since Bazel 6 and what’s coming up in Bazel 8 and 9.
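As a rough sketch (module names and versions below are placeholders, not recommendations), a minimal MODULE.bazel declares the project and its direct dependencies like this:

```
# MODULE.bazel -- Bzlmod's replacement for WORKSPACE-style dependency wiring.
module(
    name = "my_project",      # placeholder project name
    version = "0.1.0",
)

# Direct dependencies are resolved from the Bazel Central Registry.
bazel_dep(name = "rules_cc", version = "0.0.9")      # placeholder version
bazel_dep(name = "bazel_skylib", version = "1.5.0")  # placeholder version
```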

Build without the Bytes (BwoB): Build without the Bytes for builds using remote execution is now enabled by default (i.e. --remote_download_outputs defaults to toplevel). Bazel will no longer try to download any intermediate outputs from the remote server, but only the outputs of requested top-level targets instead. This significantly improves remote build performance. Learn more about BwoB.

Merged analysis and execution (Skymeld): Project Skymeld aims to improve multi-target build performance by removing the boundary between the analysis and execution phases and allowing targets to be independently executed as soon as their analysis finishes.

Platform-based toolchain resolution for Android and C++: This change helps streamline the toolchain resolution API across all rulesets, obviating the need for language-specific flags. It also removes technical debt by having Android and C++ rules use the same toolchain resolution logic as other rulesets. Full details for Android developers are available in the Android Platforms announcement.

Read the full release notes for Bazel 7.
 

Stay up-to-date with Bazel

We are thankful to everyone who played a role in making BazelCon ‘23 a big success - speakers, contributors, attendees, the planning committee, and more. We look forward to seeing you again next year!

In the meantime, follow along as we work together towards Bazel 8.

If you have any questions or feedback, or would like to share something you’ve built, reach out to product@bazel.build. We would love to hear from you!

By the Google Bazel team

*Copyright © 2023 JetBrains s.r.o. JetBrains and IntelliJ are registered trademarks of JetBrains s.r.o

Accessibility best practices for virtual events

Tuesday, December 1, 2020





An image with 4 hands together in the colors: red, blue, green, yellow.
As everyone knows, most of our open source events have transformed from in-person to digital this year. However, due to the sudden change, not everything is accessible. We took this issue seriously and decided to work with one of our accessibility experts, Neighborhood Access, to share best practices for our community. We hope this will help you organize your digital events!

Get ready for BazelCon 2020

Wednesday, November 11, 2020

With only 24 hours to go, BazelCon 2020 is shaping up to be a much anticipated gathering for the Bazel community and broader Build ecosystem. With over 1000 attendees, presentations by Googlers, as well as talks from industry Build leaders from Twitter, Dropbox, Uber, Pinterest, GrabTaxi, and more, we hope BazelCon 2020 will provide an opportunity for knowledge sharing, networking, and community building.

I am very excited by the keynote announcements, the migration stories at Twitter, Pinterest, and CarGurus, as well as technical deep dives on Bazel persistent workers, incompatible target skipping, migrating from Gradle to Bazel, and more. The “sold out” Birds of a Feather sessions and the Live Q&A with the Bazel team will bring the community together to discuss design docs, look at landings, and provide feedback on the direction of Bazel and the entire ecosystem.

We are also pleased to announce that, starting with the next major release (4.0), Bazel will support Long Term Support (LTS) releases as well as regular Rolling releases.

Some benefits of this new release cadence are:
  • Bazel will release stable, supported LTS releases on a predictable schedule with a long window without breaking changes
  • Bazel contributors / rules owners can prepare to support future LTS releases via rolling releases.
  • Bazel users can choose the release cadence that works best for them, since we will offer both LTS releases and rolling releases.
Long Term Support (LTS) releases:
  • We will create an LTS release every ~9 months; each one starts a new LTS release branch and increments the major version number.
  • Each LTS release will include all new features, bug fixes and (breaking) changes since the last major version.
  • Bazel will actively support each LTS branch for 9 months with critical bug fixes, but no new features.
  • Thereafter, Bazel will provide maintenance for two additional years with only security and OS compatibility fixes.
  • Bazel Federation reboot: Bazel will provide strong guidance about the ruleset versions that should be used with each Bazel release so that each user will not have to manage interoperability themselves.
Make sure that you register at http://goo.gle/bazelcon to be a part of the excitement of the premier build conference!

See you all at BazelCon 2020!

By Joe Hicks and the entire Bazel Team at Google

New Case Studies About Google’s Use of Go

Thursday, August 27, 2020

Go started in September 2007 when Robert Griesemer, Ken Thompson, and I began discussing a new language to address the engineering challenges we and our colleagues at Google were facing in our daily work. The software we were writing was typically a networked server—a single program interacting with hundreds of other servers—and over its lifetime thousands of programmers might be involved in writing and maintaining it. But the existing languages we were using didn't seem to offer the right tools to solve the problems we faced in this complex environment.

So, we sat down one afternoon and started talking about a different approach.

When we first released Go to the public in November 2009, we didn’t know if the language would be widely adopted or if it might influence future languages. Looking back from 2020, Go has succeeded in both ways: it is widely used both inside and outside Google, and its approaches to network concurrency and software engineering have had a noticeable effect on other languages and their tools.

Go has turned out to have a much broader reach than we had ever expected. Its growth in the industry has been phenomenal, and it has powered many projects at Google.
Credit to Renee French for the gopher illustration.

The earliest production uses of Go inside Google appeared in 2011, the year we launched Go on App Engine and started serving YouTube database traffic with Vitess. At the time, Vitess’s authors told us that Go was exactly the combination of easy network programming, efficient execution, and speedy development that they needed, and that if not for Go, they likely wouldn’t have been able to build the system at all.

The next year, Go replaced Sawzall for Google’s search quality analysis. And of course, Go also powered Google’s development and launch of Kubernetes in 2014.

In the past year, we’ve posted sixteen case studies from end users around the world talking about how they use Go to build fast, reliable, and efficient software at scale. Today, we are adding three new case studies from teams inside Google:
  • Core Data Solutions: Google’s Core Data team replaced a monolithic indexing pipeline written in C++ with a more flexible system of microservices, the majority of them written in Go, that help support Google Search.
  • Google Chrome: Mobile users of Google Chrome in lite mode rely on the Chrome Optimization Guide server to deliver hints for optimizing page loads of well-known sites in their geographic area. That server, written in Go, helps deliver faster page loads and lower data usage to millions of users daily.
  • Firebase: Google Cloud customers turn to Firebase as their mobile and web hosting platform of choice. After joining Google, the team completely migrated its backend servers from Node.js to Go, for the easy concurrency and efficient execution.
We hope these stories provide the Go developer community with deeper insight into the reasons why teams at Google choose Go, what they use Go for, and the different paths teams took to those decisions.

If you’d like to share your own story about how your team or organization uses Go, please contact us.

By Rob Pike, Distinguished Engineer

Recapping major improvements in Go 1.15 and bringing the Go community together

The Latest Version of Go is Released
In August, the Go team released Go 1.15, marking another milestone of continuous improvements to the language. As always, many of the updates were supported by our community of contributors in collaboration with the engineering team here at Google.

Following our earlier release in February, the latest Go build brings a slew of performance improvements. We’ve made significant changes behind the scenes to the compiler, reducing binary sizes by about 5% and making builds of Go applications around 20% faster while using about 30% less memory on average.

Go 1.15 also includes several updates to the core library, a few security improvements, and much more–you can dive into the full release notes here. We’re really excited to see how developers like you, ranging from those working on indie projects all the way to enterprise devs, will incorporate these updates into your projects.

A few users have been working with the release candidates ahead of the latest build and were kind enough to share their experience.

Wayne Ashley Berry, a Senior Engineer at Over, shared that “...seeing significant performance improvements in the new releases is incredible!” and, speaking of compiler improvements, showed “one of [their] services compiling ~1.3x faster” after upgrading to Go 1.15.

This mirrored our experience within Google, where building larger Go applications like Kubernetes saw 30% memory reductions and 20% faster builds.

These are just a couple examples of how some users have already seen the benefits of Go 1.15. We’re looking forward to what the rest of the gopher community will do with it!
A Better Experience For Go Developers

Over the last few months we’ve also been hard at work improving a few things in the Go ecosystem. In July, the VS Code extension for Go officially joined the Go project and more recently, we rolled out a few updates for our online resources.

We brought a few important changes to pkg.go.dev, a central source of information for Go packages and modules. With these changes came functional improvements to make the browsing experience better and minor tweaks across the site (including a cute new gopher). We also made some changes to go.dev—our hub for Go developers—making it easier to navigate the site and find examples of Go’s use in the enterprise.
The new home page on pkg.go.dev. Credit to Renee French for the gopher illustration.
We’ll be bringing even more improvements to the Go ecosystem in the coming months, so stay tuned!
Our Commitment to Open Source and Google Open Source Live

Most of these changes wouldn’t be possible without contributions from our open source community, from submitting CLs to our release process to organizing community meetups and engaging in discussions about future changes (like generics).

Being part of the open source community is something that the Go team embraces, and Google as a whole works to support every year. It’s through this community that we’re able to iterate on our work with a constant feedback loop and bring new gophers into the Go ecosystem. We’re lucky to have the support of passionate Go advocates, and even get to celebrate the occasional community gopher design!

That being said, this has been a challenging year to gather in person for meetups or larger conferences. However, the gopher community has been incredibly resilient, with many meetups taking place virtually, several of which Go team members have been able to attend.

We’d like to help the entire open source community stay connected. In that vein, we’re excited to announce that Google will host a series of free virtual events, Google Open Source Live, every month through next year! As part of the series, on November 7th, members of the Go team will be sharing community updates, some things we’ve been up to, and a few best practices around getting started with Go.

Visit the official site for Go Day on Google Open Source Live to learn more about registration and speakers. To keep up-to-date with the Go team, make sure to follow the official Go Twitter account and visit go.dev, our hub for Go developers.

By Steve Francia – Product Lead, Go Team

Three opportunities to connect with Google Open Source in June

Monday, June 15, 2020

One of our biggest challenges this year has been finding opportunities to stay connected with the many open source communities that we collaborate with across projects. As we continue to develop new ways of convening our different stakeholders, here are three opportunities to connect with Google Open Source later this month.

24 hours of Google Cloud Talks by DevRel

When: June 23, 2020
What: This is a free, digital series, organized by the Google Developer Relations team, offering practitioners an opportunity to connect with our technical experts and deepen their awareness and knowledge of a variety of Google Cloud solutions including ML/AI, Serverless, DevOps, and many more.

Talks by Google Open Source:

June 23

OpenJS World

When: June 23-24, 2020
What: Organized by The Linux Foundation, and sponsored by Google, this annual event brings together the JavaScript and web ecosystem including Node.js, Electron, AMP and more. In 2020, we’re going virtual to learn and engage with leaders deploying innovative applications at massive scale.

Talks by Google Open Source:

June 23
June 24

Open Source Summit North America

When: June 29 – July 2, 2020
What: Organized by The Linux Foundation, and sponsored by Google, this event connects the open source ecosystem under one roof, summoning over 2,000 participants across 15 conference rooms. It’s a unique environment for cross-collaboration between developers, sysadmins, devops, architects, program and product managers and others who are driving technology forward.

Talks by Google Open Source:

June 29
June 30
If you attend any of these talks, and plan to share, you can tag @GoogleOSS on Twitter. We hope to see and connect with many of you at these virtual events!

By María Cruz, Google Open Source

ZuriHac 2018: Haskell hackathon in Rapperswil

Friday, August 17, 2018

Google Open Source recently co-sponsored a three-day hackathon for Haskell, an open source functional programming language. Ivan Krišto from Google’s Zürich office talks more about the event below.

Over the weekend of June 9th, Rapperswil, Switzerland became a home for 300 Haskellers. Hochschule für Technik Rapperswil hosted the seventh annual ZuriHac, the biggest Haskell Hackathon in Europe. ZuriHac is a free, international coding festival with the goal to expand our community and to build and improve Haskell libraries, tools and infrastructure.

Participants could choose to hack all day long, attend the Haskell beginners course led by Julie Moronuki, join the Glasgow Haskell Compiler (GHC) DevOps track organized by GHC contributors with the goal to bring in new contributors, listen to the Haskell flavoured talks, or socialize and swim in the lake. The event was colocated with C++ standardization committee meetings which offered a unique opportunity for sharing ideas between the two communities.

Here is a short summary of featured talks at ZuriHac.
The event concluded with project presentations showing the results of the three-day hackathon.

Video by Hochschule für Technik Rapperswil.

Once again, we broke the attendance record! We’re already preparing for ZuriHac 2019 and hope to keep up this amazing growth. See you next year!

By Ivan Krišto, Software Engineer

Googlers on the road: CLS and OSCON 2018

Friday, July 13, 2018

Next week a veritable who’s who of free and open source software luminaries, maintainers and developers will gather to celebrate the 20th annual OSCON and the 20th anniversary of the Open Source Definition. Naturally, the Google Open Source and Google Cloud teams will be there too!

Program chairs at OSCON 2017, left to right:
Rachel Roumeliotis, Kelsey Hightower, Scott Hanselman.
Photo used with permission from O'Reilly Media.
This year OSCON returns to Portland, Oregon and runs from July 16-19. As usual, it is preceded by the free-to-attend Community Leadership Summit on July 14-15.

If you’re curious about our outreach programs, our approach to open source, or any of the open source projects we’ve released, please find us! We’re eager to chat. You’ll find us and many other Googlers throughout the week on stage, in the expo hall, and at several special events that we’re running, including:
Here’s a rundown of the sessions we’re hosting this year:

Sunday, July 15th (Community Leadership Summit)

11:45am   Asking for time and/or money by Cat Allman

Monday, July 16th (Tutorials)

9:00am    Getting started with TensorFlow by Josh Gordon
1:30pm    Introduction to natural language processing with Python by Barbara Fusinska

Tuesday, July 17th (Tutorials)

9:00am    Istio Day opening remarks by Kelsey Hightower
9:00am    TensorFlow Day opening remarks by Edd Wilder-James
9:05am    Sailing to 1.0: Istio community update by April Nassi
9:05am    The state of TensorFlow by Sandeep Gupta
9:30am    Introduction to fairness in machine learning by Hallie Benjamin
9:55am    Farm to table: A TensorFlow story by Gunhan Gulsoy
11:00am  Hassle-free, scalable machine learning with Kubeflow by Barbara Fusinska
11:05am  Istio: Zero-trust communication security for production services by Samrat Ray, Tao Li, and Mak Ahmad
12:00pm  Project Magenta: Machine learning for music and art by Sherol Chen
1:35pm    Istio à la carte by Daniel Ciruli

Wednesday, July 18th (Sessions)

9:00am    Wednesday opening welcome by Kelsey Hightower
11:50am  Machine learning for continuous integration by Joseph Gregorio
1:45pm    Live-coding a beautiful, performant mobile app from scratch by Emily Fortuna and Matt Sullivan
2:35pm    Powering TensorFlow with big data using Apache Beam, Flink, and Spark by Holden Karau
5:25pm    Teaching the Next Generation to FLOSS by Josh Simmons

Thursday, July 19th (Sessions)

9:00am    Thursday opening welcome by Kelsey Hightower
9:40am    20 years later, open source is as important as ever by Sarah Novotny
11:50am  Google’s approach to distributed systems observability by Jaana B. Dogan
2:35pm    gRPC versus REST: Let the battle begin with Alex Borysov
5:05pm    Shenzhen Go: A visual Go environment for everybody, even professionals by Josh Deprez

We look forward to seeing you and the rest of the community there!

By Josh Simmons, Google Open Source

Googlers on the road: FOSSASIA Summit 2018

Friday, March 16, 2018

In a week’s time, free and open source enthusiasts of all kinds will gather in Singapore for FOSSASIA Summit 2018. Established in 2009, the annual event attracts more than 3,000 attendees, running from March 22nd to 25th this year.

FOSSASIA logo licensed LGPL-2.1.
FOSSASIA Summit is organized by FOSSASIA, a nonprofit organization that focuses on Asia and brings people together around open technology both in-person and online. The organization is also home to many open source projects and is a regular participant in the Google Open Source team’s student programs, Google Summer of Code and Google Code-in.

Our team is excited to be among those attending and speaking at the conference this year, and we’re proud that Google Cloud is a sponsor. If you’re around, please come say hello. The highlight of our travel is meeting the students and mentors who have participated in our programs!

Here are the Googlers who will be giving presentations:

Thursday, March 22nd

2:00pm Real-world Machine Learning with TensorFlow and Cloud ML by Kaz Sato

Friday, March 23rd

9:30am BigQuery codelab by Jan Peuker
10:30am  Working with Cloud DataPrep by KC Ayyagari
1:00pm Extract, analyze & translate Text from Images with Cloud ML APIs by Sara Robinson
2:00pm    Bitcoin in BigQuery: blockchain analytics on public data by Allen Day
2:40pm What can we learn from 1.1 billion GitHub events and 42 TB of code? by Felipe Hoffa
2:40pm Engaging IoT solutions with Machine Learning by Markku Lepisto
2:45pm CloudML Engine: Qwik Start by Kaz Sato
3:20pm Systems as choreographed behavior with Kubernetes by Jan Peuker
4:00pm The Assistant by Manikantan Krishnamurthy

Saturday, March 24th

10:30am Google Summer of Code and Google Code-in by Stephanie Taylor
10:30am Zero to ML on Google Cloud Platform by Sara Robinson
11:00am  Building a Sustainable Open Tech Community through Coding Programs, Contests and Hackathons panel including Stephanie Taylor
11:05am Codifying Security and Modern Secrets Management by Seth Vargo
1:00pm    Open Source Education panel including Cat Allman
5:00pm Everything as Code by Seth Vargo

We look forward to seeing you there!

By Josh Simmons, Google Open Source

Googlers on the road: FOSDEM 2018

Monday, January 22, 2018

The Google Open Source team is currently enjoying summer weather in Sydney at Linux.conf.au, but soon we return to winter weather and head to Brussels for FOSDEM 2018. FOSDEM is a special event, famous for being non-commercial, volunteer-organized, and free to attend. It’s also huge, attracting more than 5,000 attendees.

FOSDEM logo licensed CC BY 2.0 SE.
This year FOSDEM is particularly special as it falls on top of the 20th anniversary of the open source movement and its steward, the Open Source Initiative. (In case you’re wondering, this September will mark the 35th anniversary of the free software movement.) We’re looking forward to celebrating the occasion!

You’ll find us in the hallways, at satellite events, and at our table in the stands area. You’ll also find some Googlers in the conference schedule, as well as folks sharing their experience of the most recent Google Summer of Code and Google Code-in.

If you’d like to say hello or chat, swing by our table in Building K. The highlight of our trip is meeting hundreds of the thousands of students and mentors who have participated in our programs!

Below are the Googlers who will be giving presentations:

Saturday, February 3rd
12:30pm  Google’s approach to distributed systems observability for Go by JBD (also at 2:30pm)
3:05pm    Testing and Validating distributed systems by Holden Karau

Sunday, February 4th
10:20am  Regular Expression Derivatives in Python by Michael Paddon
11:30am  Advocating For FOSS Inside Companies a panel including Max Sills
3:00pm    Your Build in a Datacenter by Jakob Buchgraber
4:00pm    Accelerating Big Data Outside of the JVM by Holden Karau

Hope to see you there!

By Josh Simmons, Google Open Source

Googlers on the road: Linux.conf.au 2018

Monday, January 15, 2018

It’s summer in Sydney and Linux.conf.au (LCA) 2018 is just a week away. LCA, an annual event that attracts people from all over the globe, including Googlers, runs January 22nd to 26th.

LCA is a cornerstone of the free and open source software (FOSS) community. It’s volunteer-run, administered by Linux Australia, and has been running since 1999. Despite its name, the conference program covers all things FOSS. The event is five days long and includes two days of miniconfs that make the program even more interesting.

The Google Open Source team is escaping “wintery” Northern California and will be hosting a Birds of a Feather (BoF) session and co-hosting an event with GDG Sydney, both focused on our student programs.

A few Googlers ended up with sessions in the program and one is running a miniconf:

Tuesday, January 23rd
All day     Create hardware with FPGAs, Linux and Python Miniconf hosted by Tim Ansell (sold out)
11:40am  Learn by Contributing to Open Source by Josh Simmons
5:15pm    Assembling a balsa-wood Raspberry Pi case by Josh Deprez
5:45pm    Streaming Machine Learning with Apache Spark Meetup by Holden Karau

Wednesday, January 24th
1:40pm    Dealing with Contributor Overload by Holden Karau
3:50pm    Securing the Linux boot process by Matthew Garrett

Thursday, January 25th
12:25pm  Google Summer of Code and Google Code-in Birds of a Feather session
6:00pm    Google Summer of Code and Google Code-in Meetup with GDG Sydney

Friday, January 26th
11:40am  The State of Kernel Self-Protection by Kees Cook
1:40pm    QUIC: Replacing TCP for the Web by Jana Iyengar
2:35pm    The Web Is Dead! Long Live The Web! by Sam Thorogood

Not able to make the conference? They’ll be posting session recordings to YouTube afterwards, thanks in part to students who have worked on TimVideos, a suite of open source software and hardware for recording video, as part of Google Summer of Code.

Naturally, you will also find the Google Open Source team at other upcoming events including FOSDEM. We look forward to seeing you in 2018!

By Josh Simmons, Google Open Source

Google Summer of Code 2017 Mentor Summit

Thursday, November 30, 2017

This year Google brought over 320 mentors from all over the world (33 countries!) to Google's offices in Sunnyvale, California for the 2017 Google Summer of Code Mentor Summit. In total, 149 organizations were represented, which provided the perfect opportunity to meet like-minded open source enthusiasts and discuss ways to make open source better and more sustainable.
Group photo by Dmitry Levin used under a CC BY-SA 4.0 license.
The Mentor Summit is run as an unconference in which attendees create and join sessions based on their interests. “I liked the unconference sessions, that they were casual and discussion based and I got a lot out of them. It was the place I connected with the most people,” said Cassie Tarakajian, attending on behalf of the Processing Foundation.

Attendees quickly filled the schedule boards with interesting sessions. One theme in this year’s session schedule was the challenging topic of failing students. Derk Ruitenbeek, part of the phpBB contingent, had this to say:
“This year our organisation had a high failure rate of 3 out of 5 accepted students. During the Mentor Summit I attended multiple sessions about failing students and rating proposals and got a lot [of] useful tips. Talking with other mentors about this really helped me find ways to improve student selection for our organisation next time.”
This year was the largest Mentor Summit ever – with the exception of our 10 Year Reunion in 2014 – and had the best gender diversity yet. Katarina Behrens, a mentor who worked with LibreOffice, observed:
“I was pleased to see many more women at the summit than last time I participated. I'm also beyond happy that now not only women themselves, but also men engage in increasing (not only gender) diversity of their projects and teams.”
We've held the Mentor Summit for the past 10+ years as a way to meet some of the thousands of mentors whose generous work for the students makes the program successful, and to give some of them and the projects they represent a chance to meet. For 52% of attendees, this was their first Mentor Summit, giving us a lot of fresh perspectives to learn from!

We love hosting the Mentor Summit and attendees enjoy it, as well, especially the opportunity to meet each other. In fact, some attendees met in person for the first time at the Mentor Summit after years of collaborating remotely! According to Aveek Basu, who mentored for The Linux Foundation, the event was an excellent opportunity for “networking with like minded people from different communities. Also it was nice to know about people working in different fields from bioinformatics to robotics, and not only hard core computer science.” 

You can browse the event website and read through some of the session notes that attendees took to learn a bit more about this year’s Mentor Summit.

Now that Google Summer of Code 2017 and the Mentor Summit have come to a close, our team is busy gearing up for the 2018 program. We hope to see you then!

By Maria Webb, Google Open Source 

Talk to Google at Node.js Interactive

Monday, October 2, 2017

We’re headed to Vancouver this week, with about 25 Googlers who are incredibly excited to attend Node.js Interactive. With a mix of folks working on Cloud, Chrome, and V8, we’re going to be giving demos and answering questions at the Google booth.

A few of us are also going to be giving talks. Here’s a list of the talks Googlers will be giving at the conference, ranging from serverless Slack bots to JavaScript performance tuning.

Wednesday, October 4th

Keynote: Franzi Hinkelmann
9:40am - 10:00am, West Ballroom A
Franzi Hinkelmann, Software Engineer @ Google

Franzi is located in Munich, Germany where she works at Google on Chrome V8. Franzi, like James and Anna, is a member of the Node.js Core Technical Committee. She speaks across the globe on the topic of JavaScript virtual machines. She has a PhD in mathematics, but left academia to follow her true passion: writing code.

Franzi will discuss her perspective on Chrome V8 in Node.js, and what the Chrome V8 team is doing to continue to support Node.js. Want to know what the future of browser development looks like? This is a must-attend keynote.

Functionality Abuse: The Forgotten Class of Attacks
12:20pm - 12:50pm, West Ballroom A
Nwokedi Idika, Software Engineer @ Google

If you were given a magic wand that would remove all implementation flaws from your web application, would it be free of security problems? If it took you more than five seconds to say “No!” (or if, worse, you said “Yes!”), then you’re the target audience for this talk. If you’re in the target audience, don’t fret, much of the security community is there with you. After this talk, attendees will understand why the answer to the aforementioned question is an emphatic “No!” and they will learn an approach to decrease their chance of failing to consider an important vector of attack for their current and future web applications.

High Performance JS in V8
5:20pm - 5:50pm, West Ballroom A
Peter Marshall, Software Engineer @ Google

This year, V8 launched Ignition and Turbofan, the new compiler pipeline that handles all JavaScript code generation. Previously, achieving high performance in Node.js meant catering to the oddities of our now-deprecated Crankshaft compiler. This talk covers our new code generation architecture - what makes it special, a bit about how it works, and how to write high performance code for the new V8 pipeline.

Thursday, October 5th

New DevTools Features for JavaScript
11:40am - 12:10pm, West Meeting Room 122
Yang Guo, Software Engineer @ Google

Ever since v8-inspector was moved to V8's repository, we have been working on a number of new features for DevTools, usable for both Chrome and Node.js. The talk will demonstrate code coverage and type profiling, and give a deep dive into how evaluating a code snippet in the DevTools console works in V8.

Understanding and Debugging Memory Leaks in Your Node.js Applications
12:20pm - 12:50pm, West Meeting Room 122
Ali Sheikh, Software Engineer @ Google

Memory leaks are hard. This talk will introduce developers to what memory leaks are, how they can exist in a garbage-collected language, and the available tooling that can help them understand and isolate memory leaks in their code. Specifically it will talk about heap snapshots, the new sampling heap profiler in V8, and various other tools available in the ecosystem.

Workshop: Serverless Bots with Node.js
2:20pm - 4:10pm, West Meeting Room 117
Bret McGowen, Developer Advocate @ Google
Amir Shevat, DevRel @ Slack

This talk will show you how to build both voice and chat bots using serverless technologies. Amir Shevat, Head of Developer Relations at Slack, has overseen 17K+ bots deployed on the platform. He will present a maturity model, along with best practices, for enterprise bots covering all sorts of use cases ranging from devops to HR and marketing. Alan Ho from Google Cloud will then show you how to use various serverless technologies to build these bots. He’ll give you a demo of Slack and Google Assistant bots incorporating Google’s latest serverless technology including Edge (API Management), CloudFunctions (Serverless Compute), Cloud Datastore, and API.ai.

Modules Modules Modules
3:00pm - 3:30pm, West Meeting Room 120
Myles Borins, Developer Advocate @ Google

ES Modules and CommonJS go together like Old Bay seasoning and vanilla ice cream. This talk will dig into the inconsistencies of the two patterns, and how the Node.js project is dealing with reconciling the problem. The talk will look at the history of modules in the JavaScript ecosystem and the subtle differences between them. It will also skim over how ECMA-262 is standardized by the TC39, and how ES Modules were developed.

Keynote: The case for Node.js
4:50pm - 5:05pm, West Ballroom A
Justin Beckwith, Product Manager @ Google

Node.js has had a transformational effect on the way we build software. However, convincing your organization to take a bet on Node.js can be difficult. My personal journey with Node.js has included convincing a few teams to take a bet on this technology, and this community. Let’s take a look at the case for Node.js we made at Google, and how you can make the case to bring it to your organization.


We can’t wait to see everyone and have some great conversations. Feel free to reach out to us on Twitter @googlecloud, or request an invite to the Google Cloud Slack community and join the #nodejs channel.

By Justin Beckwith, Languages and Runtimes Team

Professors from Around the World Get Their Students into HFOSS

Friday, July 21, 2017

Over the last four years instructors from around the world have gathered for the Professors’ Open Source Software Experience (POSSE) workshop to integrate open source concepts into their curriculum. At each event, professors make more progress toward providing students with hands-on experience via contributions to humanitarian free and open source software (HFOSS).

This year Google was proud to not only host a workshop at our San Francisco office in April, but also to collaborate with the organizers to bring a POSSE workshop to Europe for the first time.
POSSE workshop leaders, from left to right: Clif Kussmaul (Muhlenberg College), Lori Postner (Nassau Community College), Stoney Jackson (Western New England University), Heidi Ellis (Western New England University), Greg Hislop (Drexel University), and Darci Burdge (Nassau Community College).
The workshop in Italy was led by Dr. Gregory Hislop from Drexel University, and Drs. Heidi Ellis and Stoney Jackson from Western New England University, and brought together 20 instructors from Germany, Hungary, India, Italy, Macedonia, Qatar, Spain, Swaziland, the United Kingdom, and the United States. This was the most geographically diverse workshop to date!
Group photos in San Francisco, USA on April 22, 2017 (left) and Bologna, Italy on July 1, 2017 (right).
What’s next for POSSE? University instructors from institutions in the US can apply now to participate in the next workshop, November 16-18 in Raleigh, NC and join their peers in the community of instructors weaving HFOSS into their curriculum.

By Helen Hu, Google Open Source