Data Driven and Connectedness
Getting Connected with ONgDB
Moving Beyond Big Data with ONgDB
Enterprises today are amassing data at a faster rate than ever before, and most of that data flows into a data warehouse, a data lake, or scattered individual databases, where it sits. With enterprises struggling to leverage it in a holistic and meaningful way for their business, the appeal of “big data” is waning. So how do enterprises begin moving beyond big data?
In recent years we’ve seen enterprises act on the acknowledgement that organizations need to be more data driven, but there is still a gap in how to really do that well. In the years ahead we’ll see an increasing push by organizations to implement new technologies promising to get them there. It’s unfortunate, but I fear many organizations will be left disappointed. Disappointed not because the technology fails to go in successfully, but because of the lack of real business value derived from it.
Many technical architects and business people alike have been captivated by the size and speed of data. However, when it comes to knowledge and understanding, these are not the most important parameters. Technologies that are scale-first place the focus in the wrong area for solving knowledge and understanding problems. What do you gain by writing 1 billion rows per day into Redshift if you don’t have that data connected in a meaningful way to the rest of your organization? (There is definitely a time and place for just getting data persisted, but that’s a very different scenario from a BI/recommendation/analytics conversation about driving business understanding and decision making.) Ideally you’d be doing both: getting your data persisted and connected at the same time. Too many enterprises are simply collecting and hoarding data at this point.
Being data driven with a concrete understanding of your organization is completely dependent on connectedness. It’s all about knowing what things are connected to and understanding the ways they’re connected. This is the essential foundation of any business intelligence, cognitive, or predictive analytics effort. Without understanding at the core there is no movement beyond just having vast amounts of “big data”.
ONgDB provides an advantage in managing data because of the ease with which it can be brought into your data architecture and the short time needed to start seeing the benefits of connecting data from your big data deluge. ONgDB has been designed from the ground up as a native graph database to read and write a flexible graph data model that closely represents the real world.
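As a minimal sketch of what “connecting” your data can look like, the snippet below maps one flat, warehouse-style row onto nodes and relationships. ONgDB speaks the Bolt protocol, so the standard neo4j Python driver is used; the URI, credentials, and the Customer/Order/Product labels are illustrative assumptions, not a prescribed schema.

    # Minimal sketch: turning one flat, warehouse-style row into connected graph data.
    # Assumptions (not from the article): a local ONgDB instance at bolt://localhost:7687
    # reached via the Bolt-compatible neo4j Python driver, and an illustrative
    # Customer/Order/Product model.
    from neo4j import GraphDatabase

    row = {"customer_id": "c-42", "order_id": "o-1001", "sku": "SKU-7", "qty": 3}

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("ongdb", "secret"))

    def connect_row(tx, r):
        # MERGE is idempotent: re-running the pipeline does not duplicate nodes,
        # it just fills in whatever pieces of the graph are missing.
        tx.run(
            "MERGE (c:Customer {id: $customer_id}) "
            "MERGE (o:Order {id: $order_id}) "
            "MERGE (p:Product {sku: $sku}) "
            "MERGE (c)-[:PLACED]->(o) "
            "MERGE (o)-[:CONTAINS {qty: $qty}]->(p)",
            **r,
        )

    with driver.session() as session:
        session.write_transaction(connect_row, row)
    driver.close()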
ONgDB is an ACID-compliant, transactional native graph database that guarantees reliability of the data written while providing horizontal read scalability through distributed high-availability clustering. A major benefit of ONgDB being a native graph database is the ability to perform constant-time traversals through your connected data. No more JOIN pain. Being native is important for a graph database because the paradigm for reading and writing data in a graph database is different from any other database type. Be wary of non-native “graph databases”: underneath that graph layer is a datastore that hasn’t been designed to deal with a highly connected graph data model.
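To make the traversal point concrete, here is a sketch of a read that hops through relationships instead of JOINing tables. It reuses the same illustrative model and connection details as above; each hop follows a stored relationship pointer, so the cost per hop does not grow with the total number of orders or products.

    # Minimal sketch: a "customers who bought this also bought" read as a graph
    # traversal. Model, URI, and credentials are illustrative assumptions.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("ongdb", "secret"))

    def also_bought(tx, sku):
        # Two fixed-length hops out and back through the graph -- no JOINs involved.
        result = tx.run(
            "MATCH (:Product {sku: $sku})<-[:CONTAINS]-(:Order)<-[:PLACED]-(c:Customer) "
            "MATCH (c)-[:PLACED]->(:Order)-[:CONTAINS]->(other:Product) "
            "WHERE other.sku <> $sku "
            "RETURN other.sku AS sku, count(*) AS strength "
            "ORDER BY strength DESC LIMIT 10",
            sku=sku,
        )
        return [(rec["sku"], rec["strength"]) for rec in result]

    with driver.session() as session:
        print(session.read_transaction(also_bought, "SKU-7"))
    driver.close()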
Often we determine meaningful relationships between information items in advance and structure our analytics and queries around forward-based predictions to decipher the meaning of our world. ONgDB, by contrast, encourages us to see the world as a connected data set whose links are made in a dynamic manner and explored in an ad-hoc fashion over time.
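One way to picture that ad-hoc exploration is a variable-length pattern: you ask how two things are connected without deciding the path shape up front. The query below is a sketch against the same illustrative model; labels and connection details remain assumptions.

    # Minimal sketch: ad-hoc exploration with a variable-length pattern. The question
    # "how are these two customers connected?" is answered at query time, with no
    # pre-designed join path or schema change.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("ongdb", "secret"))

    query = (
        "MATCH p = shortestPath("
        "  (a:Customer {id: $a})-[*..6]-(b:Customer {id: $b})) "
        "RETURN [n IN nodes(p) | labels(n)[0]] AS hops"
    )

    with driver.session() as session:
        record = session.run(query, a="c-42", b="c-77").single()
        print(record["hops"] if record else "no path within 6 hops")
    driver.close()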
The first step in moving beyond big data is connecting that data in a meaningful way. As a software and solution architect who has to actually deliver real solutions, I speak from experience here: we’ve deployed sophisticated ETL pipelines feeding straight into ONgDB, sustaining over 640 million writes per day on a c4.4xlarge instance in AWS.
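For sustained write rates in that range, the usual shape is to batch rows and hand each batch to a single transaction with UNWIND rather than writing row by row. The sketch below shows that pattern; the batch size, model, and connection details are illustrative assumptions, not the pipeline described above.

    # Minimal sketch: batched ingestion for high sustained write throughput.
    # One transaction per batch, UNWIND-ing the batch server-side.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("ongdb", "secret"))

    def write_batch(tx, rows):
        tx.run(
            "UNWIND $rows AS row "
            "MERGE (c:Customer {id: row.customer_id}) "
            "MERGE (p:Product {sku: row.sku}) "
            "MERGE (c)-[r:PURCHASED]->(p) "
            "ON CREATE SET r.qty = row.qty",
            rows=rows,
        )

    def ingest(stream, batch_size=10_000):
        batch = []
        with driver.session() as session:
            for row in stream:                      # e.g. rows coming off an ETL pipeline
                batch.append(row)
                if len(batch) >= batch_size:
                    session.write_transaction(write_batch, batch)
                    batch = []
            if batch:                               # flush the final partial batch
                session.write_transaction(write_batch, batch)

    ingest([{"customer_id": "c-42", "sku": "SKU-7", "qty": 3}])
    driver.close()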
As you determine which data is relevant to connect, rest assured that ONgDB can be a front-line participant in receiving your data flow, so you can leverage those connections immediately and begin moving beyond big data.