Cloud Bigtable is a fully managed, highly scalable NoSQL database service that delivers single-digit millisecond latency and an availability SLA of up to 99.999%. It is a good fit for applications that need high throughput and low latency, such as real-time analytics, gaming, and telephony.
Cloud Bigtable change streams is a feature that lets you track changes to your Bigtable data and easily access and integrate that data with other systems. With change streams, you can capture database changes for multi-cloud scenarios and migrations to Bigtable, replicate changes from Bigtable to BigQuery for real-time analytics, trigger behavior in downstream applications using Pub/Sub (for event-based data pipelines), and more.
Cloud Bigtable change streams is a powerful capability that can help you get more value from your data.
NBCUniversal's Peacock streaming service uses Bigtable to manage identities across their platform, and they used Bigtable change streams to streamline and improve their data pipeline.
Bigtable change streams were easy to incorporate into our existing data pipeline using the Dataflow Beam connector to notify downstream processing of changes. This significantly sped up the processing required to achieve our data normalization goals.
Putting your data changes to work
Enabling a change stream on your table is simple: you can do it through the Google Cloud console, the API, the client libraries, or declarative infrastructure tools like Terraform.
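For example, with the Bigtable admin client for Java, enabling a change stream amounts to setting a retention period on the table. This is only a sketch: the project, instance, and table IDs below are placeholders, and exact method names can vary slightly between client library versions.

```java
import com.google.cloud.bigtable.admin.v2.BigtableTableAdminClient;
import com.google.cloud.bigtable.admin.v2.models.Table;
import com.google.cloud.bigtable.admin.v2.models.UpdateTableRequest;
import org.threeten.bp.Duration;

public class EnableChangeStream {
  public static void main(String[] args) throws Exception {
    // Placeholder IDs -- replace with your own project, instance, and table.
    String projectId = "my-project";
    String instanceId = "my-instance";
    String tableId = "my-table";

    try (BigtableTableAdminClient adminClient =
        BigtableTableAdminClient.create(projectId, instanceId)) {
      // Enabling the change stream means setting a retention period (up to 7 days).
      Table table =
          adminClient.updateTable(
              UpdateTableRequest.of(tableId).addChangeStreamRetention(Duration.ofDays(7)));
      System.out.println("Change stream enabled on " + table.getId());
    }
  }
}
```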
Once enabled, all data changes to the table are captured and retained for up to seven days, which is useful for auditing or for tracking how data changes over time; you can adjust the retention period to fit your requirements. Using the Bigtable connector for Dataflow, you can build custom processing pipelines that apply batch processing, stream processing, and machine learning to your Bigtable data. Or, for even more flexibility and control, you can integrate directly with the Bigtable API.
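As a rough sketch of the Dataflow route, a Beam pipeline can read the change stream with the Bigtable connector roughly like this. The project, instance, table, and app profile IDs are placeholders, and exact connector details may differ by Beam version.

```java
import com.google.cloud.bigtable.data.v2.models.ChangeStreamMutation;
import com.google.protobuf.ByteString;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigtable.BigtableIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;

public class ChangeStreamPipeline {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Read the change stream; each element pairs a row key with the mutation
    // applied to that row. All IDs below are placeholders.
    PCollection<KV<ByteString, ChangeStreamMutation>> changes =
        pipeline.apply(
            "ReadChangeStream",
            BigtableIO.readChangeStream()
                .withProjectId("my-project")
                .withInstanceId("my-instance")
                .withTableId("my-table")
                .withAppProfileId("my-app-profile"));

    // Turn each change record into a simple description; in a real pipeline
    // this is where you would filter, enrich, or route records to a sink.
    changes.apply(
        "DescribeChange",
        MapElements.into(TypeDescriptors.strings())
            .via(
                (KV<ByteString, ChangeStreamMutation> record) ->
                    record.getKey().toStringUtf8()
                        + " changed at "
                        + record.getValue().getCommitTimestamp()));

    pipeline.run();
  }
}
```

Because the connector emits one record per changed row, downstream transforms can fan these records out to whatever sinks your pipeline needs.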
Use cases for Cloud Bigtable change streams
Change streams can be used for a variety of use cases and mission-critical workloads.
ML and analytics
Collect and analyze event data in real time. This can be used to monitor user activity and update feature store embeddings for personalization, to analyze system performance in IoT services to find bugs or detect security concerns, or to monitor events to detect fraud.
Change streams can also feed BigQuery, where you can track changes to data over time, identify patterns, and generate reports. For large-scale analytics, you can replicate your data to BigQuery or send change records to BigQuery as a set of change logs, as sketched below.
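A minimal sketch of the change-log approach, assuming a pipeline that already reads the change stream as shown earlier: each mutation is flattened into one change-log row and appended to a BigQuery table. The table name and schema here are illustrative only.

```java
import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import com.google.cloud.bigtable.data.v2.models.ChangeStreamMutation;
import com.google.protobuf.ByteString;
import java.util.Arrays;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptor;

public class ChangeLogToBigQuery {
  // Illustrative change-log schema: one BigQuery row per Bigtable mutation.
  static final TableSchema CHANGELOG_SCHEMA =
      new TableSchema()
          .setFields(
              Arrays.asList(
                  new TableFieldSchema().setName("row_key").setType("STRING"),
                  new TableFieldSchema().setName("commit_timestamp").setType("STRING"),
                  new TableFieldSchema().setName("entry_count").setType("INTEGER")));

  // Appends one change-log row per mutation to a BigQuery table (name is a placeholder).
  static void writeChangeLog(PCollection<KV<ByteString, ChangeStreamMutation>> changes) {
    changes
        .apply(
            "ToTableRow",
            MapElements.into(TypeDescriptor.of(TableRow.class))
                .via(
                    (KV<ByteString, ChangeStreamMutation> record) ->
                        new TableRow()
                            .set("row_key", record.getKey().toStringUtf8())
                            .set("commit_timestamp",
                                record.getValue().getCommitTimestamp().toString())
                            .set("entry_count", record.getValue().getEntries().size())))
        .setCoder(TableRowJsonCoder.of())
        .apply(
            "WriteChangeLog",
            BigQueryIO.writeTableRows()
                .to("my-project:my_dataset.bigtable_change_log")
                .withSchema(CHANGELOG_SCHEMA)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
  }
}
```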
Event-based applications
Use change streams to trigger downstream processing from specific events, for example to track player actions in real time in gaming. This can be used to update game state, give players feedback, or detect cheating.
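One way to wire this up, sketched under the same assumptions as the earlier pipeline: each change record is formatted as a small notification message and published to a Pub/Sub topic (the topic name is a placeholder), where downstream services subscribe and react.

```java
import com.google.cloud.bigtable.data.v2.models.ChangeStreamMutation;
import com.google.protobuf.ByteString;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;

public class ChangeStreamToPubSub {
  // Publishes one notification per changed row; downstream services (a game
  // server, a notification service, and so on) subscribe to the topic and react.
  static void publishChanges(PCollection<KV<ByteString, ChangeStreamMutation>> changes) {
    changes
        .apply(
            "FormatNotification",
            MapElements.into(TypeDescriptors.strings())
                .via(
                    (KV<ByteString, ChangeStreamMutation> record) ->
                        "row=" + record.getKey().toStringUtf8()
                            + ",committed=" + record.getValue().getCommitTimestamp()))
        .apply(
            "PublishNotification",
            // Placeholder topic name.
            PubsubIO.writeStrings().to("projects/my-project/topics/bigtable-changes"));
  }
}
```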
Retailers use change streams to track catalog changes, such as availability or price, and notify consumers of updates.
Multi-cloud and migration
Capture Bigtable changes for multi-cloud or hybrid-cloud scenarios. For example, use change streams together with the Bigtable HBase replication tooling to keep data replicated between cloud and on-premises databases. The same architecture can also be used to migrate to Bigtable online without disrupting serving.
Compliance
Some workloads must comply with regulations and standards such as HIPAA or PCI DSS. Maintaining the change log gives you a record of all data changes, which can help you demonstrate compliance and is useful when you need to investigate a security incident or conduct an audit.
Learn more
Change streams is a powerful feature that lets you act on your Bigtable data in a variety of ways to meet your business needs and improve your data pipelines.