"These AI tools enable us to study systems at a scale that wasn't possible before," says Komal Singh of Google Research, on our partnership with the Geena Davis Institute. Help us create a future where everyone feels seen on (and off) screen. → https://lnkd.in/g6CMqQFV
About us
From conducting fundamental research to influencing product development, our research teams have the opportunity to impact technology used by billions of people every day. We aspire to make discoveries that impact everyone, and sharing our research and tools to fuel progress in the field is fundamental to our approach.
- Website
- https://research.google/
- Industry
- Technology, Information and Internet
- Company size
- 1,001-5,000 employees
Updates
Today on the blog, we demonstrate the feasibility and benefits of learning from human feedback for text-to-image generation: we collect rich human feedback, train a model to predict that feedback, and use the predictions to improve image generation. Learn more → https://goo.gle/4c9i2Zz
We're back with Research@ London. 🇬🇧 The event covered a range of research projects, from PaliGemma, our open vision-language model, to Med-Gemini, AI for breast cancer, genomics, the Health Equity LLM Toolbox, and connectomics. A big round of applause to our speakers and attendees for making this another successful event. 👏
Can AI help us understand representation in media? → https://goo.gle/4bkDr0z Google Research uses #AI to analyze massive amounts of media content, uncovering hidden patterns in representation. We partnered with the Geena Davis Institute and the University of Southern California to inform a more equitable media landscape.
That's a wrap on Research@ Munich! 🙌 Speaker sessions highlighted some of the most groundbreaking advances in AI, from climate to health, privacy, and quantum computing. Research@ is a for-researchers, by-researchers event series; thank you to everyone for the engaging presentations and discussions! → https://goo.gle/4cjONmM
We believe in the importance of supporting diverse perspectives and empowering innovators who are harnessing AI to make a positive impact on the world. So we're thrilled to celebrate the 2024 Google for Startups Founders Fund recipients — a group of promising Black and Latino-led companies pushing the boundaries of what's possible with AI. From healthcare to education, these exceptional startups are tackling real-world challenges head-on. Learn more about this year's cohort below.
Introducing the 2024 Google for Startups Founders Fund recipients in the United States 🌟 From boosting crop yields to revolutionizing how businesses secure their data, these visionary Black and Latino-led startups are harnessing AI to tackle critical issues across sustainability, enterprise, cybersecurity, and beyond. Congratulations Akeptus, Bountiful, Cambio, EdVisorly, HEARD by Elis, HacWare, Hue., InOrbit.AI, JustAir, Maverick, Pagedip, Raincoat, Sensagrate, Sortile, TackleAI, Trustible, Improving Aviation, Beta Financial Services, Hire Henry, and Waterplan (YC S21)! Read more about the 2024 Founders Fund recipients here: https://lnkd.in/gu94sBjE
Introducing the Meeting Information Seeking Dialogs dataset (MISeD), which can be used to fine-tune model agents that support natural language conversations about meeting recordings, so users can catch up on meetings they may have missed. Learn more at https://goo.gle/4caXu2O
Congratulations to the authors of the “Rich Human Feedback for Text-to-Image Generation” paper, which received the #CVPR2024 Best Paper Award. Check out the paper at: https://lnkd.in/gHePsEju
Congratulations to Zhengqi Li, Richard Tucker, Noah Snavely, and Aleksander Holynski. Their paper “Generative Image Dynamics” received the #CVPR2024 Best Paper Award. Read the paper: https://lnkd.in/g8hSd-3F
At 3:45 PM today, the #CVPR2024 Google booth will host Hritik Bansal and Yonatan Bitton for a talk on how VideoCon provides a framework to curate high-quality video-text data, enabling video-centric learning and comprehensive coverage of semantic variations.