Author Archives: DBDS Admin

8th Stat4Onc Annual Symposium 5/16-17

We are excited to announce that registration is now open for the 8th Stat4Onc Annual Symposium, which will take place May 16-17, 2025, at Stanford University, Stanford, CA, USA.

The Stat4Onc Annual Symposium is an NCI-sponsored conference that fosters interdisciplinary discussions between clinical and quantitative scientists on cancer clinical trials. Researchers from academia, industry, and regulatory agencies are invited to share their latest research, explore novel ideas, and collaborate on solutions for enhancing trial design, drug development, and patient care.
Find more details here: The Stat4Onc Annual Symposium

Five short courses are offered on May 15 and May 18. Please visit the registration link for short-course registration fees.

REGISTER NOW!

DBDS Open House May 6

We’re excited to announce the DBDS OPEN HOUSE 2025 – and we’d love for you to join us!

Date: May 6, 2025
Time: 4pm-6pm
Location: Edwards Building, 300 Pasteur Drive, 3rd floor, Stanford, CA 94305

Nima Aghaeepour in Nature Medicine: AI can help doctors give intravenous nutrition to preemies, Stanford Medicine study finds

An algorithm that learned from tens of thousands of nutrition prescriptions for premature babies could reduce medical errors and better identify what nutrients the smallest patients need.

You may remember that Nima presented this work at the C&C in January.
Read it here: https://med.stanford.edu/news/all-news/2025/03/prematurity-nutrition0.html

Zou group’s TextGrad method for self-improving generative AI is published in Nature

Abstract: Recent breakthroughs in artificial intelligence (AI) are increasingly driven by systems orchestrating multiple large language models (LLMs) and other specialized tools, such as search engines and simulators. So far, these systems are primarily handcrafted by domain experts and tweaked through heuristics rather than being automatically optimized, presenting a substantial challenge to accelerating progress. The development of artificial neural networks faced a similar challenge until backpropagation and automatic differentiation transformed the field by making optimization turnkey. Analogously, here we introduce TextGrad, a versatile framework that performs optimization by backpropagating LLM-generated feedback to improve AI systems….
Read it here: https://www.nature.com/articles/s41586-025-08661-4