Greek Architectures of Data Processing
Three popular data processing architectures for big and small data on the cloud. They cover a range of scenarios across batch, real-time, small, and big data. The links take you to the dedicated blog post for each architecture.
In this section we go to Azure Databricks, create the cluster and notebook, ingest the data in real time, process it, and visualize the stream.
Databricks is becoming the new normal in cloud data processing, on both Azure and AWS. This is a step-by-step guide to getting started with real-time (streaming) analytics using Spark streaming on Databricks.
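To make the streaming flow concrete, here is a minimal sketch of what such a notebook could run, assuming Spark Structured Streaming and the built-in "rate" source as a stand-in for a real feed (the source, window size, and table name are illustrative, not from the post):

```python
# Minimal Structured Streaming sketch (PySpark). On Databricks the
# SparkSession is already available as `spark`.
from pyspark.sql import SparkSession
from pyspark.sql.functions import window, count

spark = SparkSession.builder.appName("streaming-demo").getOrCreate()

# The "rate" source generates timestamped rows; swap in Event Hubs or Kafka
# options for a real ingest pipeline.
events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Aggregate events into 1-minute tumbling windows.
counts = (events
          .groupBy(window(events.timestamp, "1 minute"))
          .agg(count("*").alias("events")))

# Write the running aggregate to an in-memory table that a notebook cell
# (or display()) can query and chart while the stream runs.
query = (counts.writeStream
         .outputMode("complete")
         .format("memory")
         .queryName("event_counts")
         .start())

# spark.sql("SELECT * FROM event_counts").show()  # inspect the live results
```

In a notebook, pointing display() or a SQL cell at the in-memory table gives a live, refreshing visualization of the stream.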
The key steps organizations can take to cross that chasm, get past the roadblock, and lay the foundation that will enable them to move along the maturity curve.
There are gaps in the data management and maintenance space in Azure. Here are two things that I feel are missing from the current Azure landscape and will hopefully be addressed soon.
Imagine a scenario where we maintain an immutable, persistent stream of data and, instead of processing the data twice, we replay the stream through the same code for a different point in time. That is the premise of the Kappa architecture.
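A small sketch of the replay idea, assuming a Kafka-compatible log read with Spark Structured Streaming (the broker address, topic name, and Delta sink paths are placeholders, and the Kafka connector package must be on the cluster):

```python
# Kappa-style replay sketch (PySpark + Kafka source). The same streaming job
# reprocesses history simply by reading the immutable log from the beginning.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kappa-replay").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # assumed endpoint
          .option("subscribe", "events")                      # assumed topic
          .option("startingOffsets", "earliest")              # replay from the start of the log
          .load())

# The same transformation code serves both "live" and "replayed" runs;
# only startingOffsets (and the checkpoint location) change between them.
parsed = events.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp")

query = (parsed.writeStream
         .format("delta")                                     # assumed sink; any sink works
         .option("checkpointLocation", "/tmp/checkpoints/kappa-replay")
         .outputMode("append")
         .start("/tmp/tables/replayed_events"))
```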
The key reasons a data lake needs a good structure are:
1) Security: role-based security is needed on the lake for read access.
2) Extendibility: it should be easy to extend the lake after the first round, so more systems can be added.
3) Usability: it should be easy to find and use the data in the lake, and users should not get lost.
4) Governance: it should be simple to apply governance practices to the lake in terms of quality, metadata management, and information lifecycle management (ILM).
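As an illustration only, the sketch below lays out simple lake zones and applies a role-based read ACL with the azure-storage-file-datalake SDK; the account name, container, zone names, and the AAD group object id are all placeholders, not a prescribed layout:

```python
# Illustrative sketch: zoned folder layout plus a role-based read ACL on
# ADLS Gen2. All names and ids below are made up for the example.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mylakeaccount.dfs.core.windows.net",  # assumed account
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("lake")                    # assumed container

# A simple zoned layout keeps the lake easy to extend and navigate:
# each new source system becomes a new folder under each zone.
for zone in ["raw", "curated", "presentation"]:
    fs.create_directory(f"{zone}/sales")

# Grant an AAD group read/execute on the curated zone (security/governance).
readers_group_oid = "00000000-0000-0000-0000-000000000000"     # placeholder object id
curated = fs.get_directory_client("curated")
curated.set_access_control(
    acl=f"user::rwx,group::r-x,group:{readers_group_oid}:r-x,mask::r-x,other::---"
)
```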
Lambda architecture is a data-processing architecture designed to handle massive quantities of data by taking advantage of both batch and stream-processing methods. This approach attempts to balance latency, throughput, and fault tolerance by using batch processing to provide comprehensive and accurate views of batch data, while simultaneously using real-time stream processing to provide views of online data.
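A toy illustration of how the layers fit together (plain Python, all names made up; real systems would expire only the portion of the speed view already absorbed by the batch run):

```python
# Lambda pattern in miniature: a batch view recomputed from the master
# dataset, a speed view updated per event, and a serving query merging both.
from collections import Counter

master_dataset = []      # immutable, append-only record of all events
batch_view = Counter()   # comprehensive view, recomputed from scratch
speed_view = Counter()   # incremental view of events since the last batch run

def ingest(event):
    """Every event lands in the master dataset and the speed layer."""
    master_dataset.append(event)
    speed_view[event["key"]] += 1

def run_batch_layer():
    """Periodic full recomputation: accurate but high latency."""
    global batch_view, speed_view
    batch_view = Counter(e["key"] for e in master_dataset)
    speed_view = Counter()   # simplification: reset once the batch has caught up

def serve(key):
    """Serving layer merges the batch and real-time views."""
    return batch_view[key] + speed_view[key]

ingest({"key": "page_a"})
ingest({"key": "page_a"})
print(serve("page_a"))   # 2: both counts come from the speed view
run_batch_layer()
print(serve("page_a"))   # 2: now served from the batch view
```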
The Cost Management solution in Azure helps monitor, optimize, and control the cost of Azure resources across subscriptions and resource groups. Cost Management shows organizational cost and usage patterns with advanced analytics.
Databricks has become the new normal in cloud data processing. If you are using or planning to use Azure Databricks, this post will guide you through some interesting things to investigate as you start.