Come join us for a discussion on decomposing the data monolith via Data Mesh and Event Streaming. Speaker James Gollan will explain how an event-driven data mesh built on top of Apache Kafka® provides the optimum way to access important business data and unify the operational and analytical planes. The presentation will culminate in a live demonstration from a product owner’s perspective, browsing the data catalog to build a customer service application.
In the second session of the day, Harshana will speak about how MuleSoft DataGraph allows you to build a unified schema to simplify your API and application network. The session will also explain how DataGraph lets you define relationships between data products across your business domains, and how it enables consumers outside a given domain to self-serve those data products in a well-governed, standardized manner.
We'll have food, drinks, and prizes in the evening so we look forward to seeing you there!
Manager of Enterprise Solutions Engineering, ANZ
Senior Success Architect
20+ years of experience with open source software, 3+ years with Confluent
LinkedIn | https://www.linkedin.com/in/jamesgollan/
12+ years of experience in integration, spanning the financial services sector and integration product vendors.
LinkedIn | https://au.linkedin.com/in/harshana05
Integration professional with 9 years of experience across the Retail, Appliances, Healthcare and Telecom domains.
Roles include RPA Developer, Integration Developer, Integration Application Support, and Architect.