
New California Law Requires Computer Science in High Schools

Good morning! The future of technology education in California is coming into focus with a new law requiring computer science courses for high school graduation by 2031. Meanwhile, MIT researchers have developed a method to enable AI chatbots to converse smoothly for extremely long periods without performance degradation. And MongoDB is modernizing real-time data processing by allowing stream analysis directly inside their Atlas cloud database.

New California Law Requires Computer Science in High Schools

Palo Alto Online

Context: Only 5% of California high schoolers currently take computer science classes, even though the state has over 45,000 open computing jobs with an average salary of $153K. This gap between tech workforce demand and supply is what AB 2097 aims to address.

AB 2097 is a new bill introduced to make computer science a high school graduation requirement by 2031. The goal is to align education in California with the needs of its massive technology industry. Twenty-seven other states already have similar CS education requirements.

Considerations

  • Equitable access is an issue: currently only 30% of California CS students are female, and underrepresented minorities have less access to CS courses

Potential Impact: Passing this bill could drive investment in areas like educational software, credentialing programs, and devices to enable CS learning. It may also kickstart initiatives to make computer science engaging and inclusive for all students.

The bottom line is that while AB 2097 signals growing priority for tech skills, its success depends greatly on commitment to equitable execution across all California schools.

Read More Here

New Method Lets AI Chatbots Talk All Day Without Slowing Down

Christine Daniloff, MIT

MIT researchers have developed a new technique called StreamingLLM that solves a major problem with AI chatbots. Chatbots powered by large language models often crash or slow down drastically during long conversations as their short-term memory cache fills up and has to remove old word tokens.

The key innovation with StreamingLLM is that it keeps special "attention sink" tokens in the cache even when it becomes full. These attention sinks create stable model dynamics that enable smooth conversation flows. Specifically:

  • 4 sink tokens are added to the start of the chatbot's cache

  • Other tokens can enter and exit the cache around these sinks

  • Positional information for tokens stays consistent even as the cache changes

Together, these modifications let the models handle extremely long conversations with over 4 million words without performance issues. Benchmarking showed StreamingLLM processes new words 22x faster than previous methods when cache size grows large.
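The eviction policy described above can be illustrated with a toy sketch (not the researchers' implementation): pin the first few tokens as attention sinks, treat the rest of the cache as a sliding window, and assign positions relative to the cache rather than the original text. The class name and cache sizes here are made up for illustration.

```python
from collections import deque

SINK_TOKENS = 4   # tokens pinned at the start of the cache
CACHE_SIZE = 8    # tiny cache for illustration; real caches hold thousands

class SinkCache:
    """Toy model of the attention-sink eviction policy: the first few
    tokens are never evicted; everything else is a sliding window."""

    def __init__(self):
        self.sinks = []        # pinned sink tokens
        self.window = deque()  # rolling window of recent tokens

    def add(self, token):
        if len(self.sinks) < SINK_TOKENS:
            self.sinks.append(token)   # first 4 tokens become sinks
            return
        self.window.append(token)
        if len(self.sinks) + len(self.window) > CACHE_SIZE:
            self.window.popleft()      # evict the oldest non-sink token

    def contents(self):
        # Positions are assigned relative to the cache, not the original
        # text, so they stay stable as old tokens are evicted.
        return list(enumerate(self.sinks + list(self.window)))

cache = SinkCache()
for t in range(20):   # stream 20 token ids through the cache
    cache.add(t)

print(cache.contents())
# sinks 0-3 survive; the window holds only the most recent tokens
```

However long the stream gets, the cache never grows past its fixed size, which is why cost per new token stays flat.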

This consistent high efficiency makes the approach very promising for deploying chatbots commercially as AI assistants for writing, coding, customer service, and more.

Read More Here

MongoDB Modernizes Streaming with Atlas Stream Processing

MongoDB

MongoDB has introduced a new capability called Atlas Stream Processing that allows you to process data streams directly within their Atlas cloud database service. This has the potential to greatly simplify real-time data processing.

What is Atlas Stream Processing?: Atlas Stream Processing lets you continuously process high-velocity streams of event data using MongoDB's powerful aggregation framework and document model. This means you can filter, transform, and analyze complex event streams in real-time to power reactive applications.
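To give a feel for the filter-and-transform style this enables, here is a minimal Python mock of MongoDB-style `$match` and `$project` stages applied to a stream of event documents. The field names (`sensor_id`, `temp_c`) and the evaluator itself are hypothetical; in Atlas Stream Processing you would express the same stages in a real aggregation pipeline against a live source.

```python
# Pipeline: keep only hot readings, then drop all fields except two.
pipeline = [
    {"$match": {"temp_c": {"$gt": 30}}},
    {"$project": {"sensor_id": 1, "temp_c": 1}},
]

def run_stage(stage, doc):
    """Apply one (very simplified) aggregation stage to a document."""
    if "$match" in stage:
        for field, cond in stage["$match"].items():
            if "$gt" in cond and not doc.get(field, float("-inf")) > cond["$gt"]:
                return None   # document filtered out
        return doc
    if "$project" in stage:
        return {k: doc[k] for k, keep in stage["$project"].items()
                if keep and k in doc}
    return doc

def process(stream):
    """Push each event through every stage, yielding survivors."""
    for doc in stream:
        for stage in pipeline:
            doc = run_stage(stage, doc)
            if doc is None:
                break
        if doc is not None:
            yield doc

events = [
    {"sensor_id": "a1", "temp_c": 22, "ts": 1},
    {"sensor_id": "b2", "temp_c": 35, "ts": 2},
]
print(list(process(events)))  # → [{'sensor_id': 'b2', 'temp_c': 35}]
```

The appeal of the unified model is exactly this: the same stage vocabulary you use for queries also describes continuous processing.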

Key Benefits

  • Unified platform - Process streaming data alongside database data using the same MongoDB API and tools you already know

  • Flexible data model - Handle messy, nested data without needing to preprocess

  • Managed service - Avoid infrastructure overhead; Atlas handles it for you

  • Low latency - Optimized for speed with native Kafka integration

Reliability is also a key emphasis with Atlas Stream Processing. It automatically catches potential issues like corrupted messages to avoid failures and data loss. This means you can focus on the processing logic rather than building lots of plumbing.

Example Use Cases: There are many possibilities for real-time stream processing opened up by Atlas Stream Processing, including:

  • Personalized recommendations that update as customer behavior changes

  • Real-time alerting based on IoT sensor data  

  • Fraud detection during financial transactions

  • Maintaining aggregated dashboards that are always up-to-date

Read More Here

🔥 More Notes

Youtube Spotlight

The Data Structure You Use Matters a Lot

Click to Watch

Forrest discusses the importance of choosing the right data structure for implementing the undo-redo function. He explores the use of arrays, linked lists, hash tables, queues, and stacks, highlighting their advantages and drawbacks in managing the history of changes in an application.
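One common answer to the undo-redo problem (a sketch of the classic two-stack approach, not code from the video) looks like this in Python:

```python
class History:
    """Two-stack undo/redo: 'undo' pops the undo stack onto the redo
    stack, and a new edit clears the redo stack."""

    def __init__(self, initial=""):
        self.state = initial
        self.undo_stack = []  # past states
        self.redo_stack = []  # states undone since the last edit

    def edit(self, new_state):
        self.undo_stack.append(self.state)
        self.redo_stack.clear()  # a new edit invalidates redo history
        self.state = new_state

    def undo(self):
        if self.undo_stack:
            self.redo_stack.append(self.state)
            self.state = self.undo_stack.pop()

    def redo(self):
        if self.redo_stack:
            self.undo_stack.append(self.state)
            self.state = self.redo_stack.pop()

h = History()
h.edit("a")
h.edit("ab")
h.undo()
print(h.state)  # → a
h.redo()
print(h.state)  # → ab
```

Stacks fit because undo and redo are inherently last-in, first-out; structures like arrays or linked lists can back the stacks, which is where the trade-offs Forrest covers come in.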

Was this forwarded to you? Sign Up Here
