Strengthening Presence: Confluent's Strategic Approach to the Indian Market

Explore Confluent's strategic approach to the Indian market with Kai Waehner, Global Field CTO at Confluent, as he discusses the company's initiatives, investments, and upcoming projects in an exclusive interview with Ciol.

By Manisha Sharma

Confluent is the data streaming platform that is pioneering a fundamentally new category of data infrastructure that sets data in motion. Confluent’s cloud-native offering is the foundational platform for data in motion—designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly stream across the organization. With Confluent, organizations can meet the new business imperative of delivering rich, digital front-end customer experiences and transitioning to sophisticated, real-time, software-driven backend operations.

Kai Waehner is Global Field CTO at Confluent. He works with customers across the globe and with internal teams like engineering and marketing. Kai’s main area of expertise lies within the fields of Data Streaming with Apache Kafka and Apache Flink, Analytics, AI/Machine Learning, Messaging, Integration, Microservices, Internet of Things, Stream Processing and Blockchain. Kai is an author and regular speaker at international conferences such as Devoxx, ApacheCon and Kafka Summit, writes articles for professional journals, and shares his experiences with new technologies on his blog.

Kai Waehner, Global Field CTO at Confluent, spoke with Ciol to discuss Confluent's strategic approach to the Indian market, detailing the company's initiatives and investments to establish a strong presence, as well as upcoming projects.

Here's the detailed discussion:

Can you elaborate on the specific initiatives and investments Confluent has made to establish a strong presence in the Indian market?

The Indian market is growing significantly, which is no surprise to us due to the extensive digitalization and innovation occurring here. From a conference perspective, we have seen a substantial number of attendees. In terms of investment, our development and engineering teams in India have expanded considerably, with around 20% of our employees now based in the APAC region, many of whom are in India.

Additionally, from a field perspective, we have a strong presence in sales and marketing, exemplified by our participation in events like the Kafka Summit. This is a significant investment for us, hosting such events in India for the first time. Previously, we only held Kafka Summit and similar events in Europe and the US.

Our commitment to India includes bringing in our senior executives to engage with customers, partners, and the broader community, as we maintain a community-driven business model. This also involves interactions with the press and research analysts. Overall, it is a substantial investment in our engineering teams and these significant events, which we plan to continue supporting.

Can you outline the key components of Confluent's product roadmap aimed at fulfilling this vision?

Our objective is to deliver a complete data streaming platform. Our vision is that everything should be event-driven, allowing actions to be taken when they are most valuable. Real-time data outperforms slower data in almost every use case, and our platform is designed to meet this need. This long-term vision has been a guiding principle since our CEO founded the company, and it is already reflected in our current capabilities.

From a product perspective, our vision is built on three pillars: completeness, ubiquity, and cloud nativity. These principles ensure that we bring data streaming wherever it is needed by our customers, regardless of the industry. In India, the landscape is similar to other regions. For example, the public sector and financial services often operate on-premises, while digital-native companies, like ride-hailing apps and food delivery services, operate entirely in the cloud. We cater to all these customers, delivering data streaming solutions everywhere.

Our commitment extends beyond product offerings to substantial investments in India. We are enhancing our engagement with system integrator partners and developing comprehensive solutions. One such initiative is "Build with Confluent," which supports partners in creating solutions. A notable example is a payment solution vendor called Mindgate, which enables real-time payments and is used by Indian banks. Confluent serves as the underlying technology for their solution. This approach is a key component of our sales strategy, allowing us to expand our business by working closely with partners to deliver complete business solutions.

How does Confluent adapt its offerings to cater to the evolving needs of industries embracing real-time data processing?

Confluent was built from the beginning to be real-time by nature. This is the main idea of our business and what differentiates it from many traditional solutions like data lakes or data warehouses. One of the big strengths of our product and future strategy is that we unify transactional and analytical workloads. Our platform's core is event-based and transactional, allowing connections to payment systems and order management systems. Additionally, our platform supports analytics, either through our own services, such as Apache Flink as a cloud offering, or by feeding data into other systems.
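The event-driven idea described above can be sketched in a few lines. This is a minimal, self-contained illustration (hypothetical event shapes, not Confluent's API): each event is handled the moment it arrives, so transactional state is current after every event, instead of waiting for a nightly batch load.

```python
from dataclasses import dataclass

@dataclass
class PaymentEvent:
    order_id: str
    amount: float

def handle(event: PaymentEvent, totals: dict) -> None:
    # Transactional side: record the payment immediately on arrival.
    totals[event.order_id] = totals.get(event.order_id, 0.0) + event.amount

# Simulated stream; in a real deployment this would be a Kafka consumer loop,
# and analytics (e.g. Flink) would read the very same events.
stream = [PaymentEvent("o-1", 100.0), PaymentEvent("o-2", 50.0), PaymentEvent("o-1", 25.0)]
totals: dict = {}
for event in stream:
    handle(event, totals)

print(totals)  # running totals are up to date after every single event
```

The point of the sketch is that the same event log feeds both the transactional handler and any downstream analytical consumer, which is the unification the answer describes.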

We partner with a variety of vendors, including cloud providers and others like Snowflake, Databricks, and MongoDB. Our approach is to strengthen businesses not just by providing connectors but by building deep integrations as a first-class product. For instance, we discussed Tableflow, which unifies integration based on the Apache Iceberg table format, now becoming the de facto standard for data sharing.

The significant benefit for the customer is that they only need to set it up once, and everyone can consume it. As a consumer, you always have the freedom of choice. This is crucial for us as we position ourselves as the central nervous system of real-time data. We can process it ourselves and share it with any other application that needs to consume it.

What are the primary advantages of adopting a cloud-native approach to data streaming, particularly in the context of Confluent's solutions?

Cloud-native means having characteristics that are elastic, flexible, and scalable. These benefits allow businesses to start small and scale up if successful.

Customers such as Meesho and Swiggy are perfect examples of starting small in the cloud and scaling up when successful without needing to rearchitect the system. This is a significant benefit of cloud-native solutions.

This is also why we re-engineered Kafka to be cloud-native. Our Kora engine separates compute from storage, supports automatic rebalancing, and includes many other features. This is the big value of a cloud-native solution.

Additionally, we have incorporated these characteristics into our Confluent Cloud offering, which is a serverless, fully managed service available on all major cloud platforms so that you consume it as a service with critical SLAs. This is highly relevant for the Indian market. Even for customers who cannot move to the cloud yet, such as banks or the public sector, our Confluent Platform offers many cloud-native capabilities for on-premise use. Features like tiered storage and self-balancing clusters allow these customers to scale easily and enjoy other cloud-native advantages.

This investment is crucial for critical customers who are not ready for the cloud yet.

Are there any specific considerations for businesses when they decide to migrate to Confluent cloud, particularly in terms of scalability and cost-effectiveness?

Customers migrate to us not only from open-source Kafka but also from other Kafka vendors' deployments, precisely because of the benefits of the cloud. A big benefit of Confluent Cloud is that it is consumption-based: you pay as you go; the more you use, the more you pay. During peak periods, such as major holidays like Christmas, demand in retail and ride-hailing goes up. With Confluent Cloud, you can scale up to meet these demands and pay more during these peak times. Afterwards, as demand decreases, you can scale down, enjoying the flexibility that cloud services offer.

In the past, when deploying on-premise, you needed to scale for peak demands by buying hardware for events like Christmas, even though 80% of that capacity remained unused most of the time. The flexibility of Confluent Cloud allows businesses to operate without interruptions, ensuring that you never have to turn away customers due to capacity issues. This is one technical reason why businesses transition to Confluent Cloud as a fully managed service.
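The peak-provisioning arithmetic above can be made concrete with a back-of-the-envelope comparison. The numbers here are purely illustrative assumptions, not Confluent pricing: capacity sized for the yearly peak all year round versus paying for actual usage with one peak month.

```python
# Illustrative capacity units and cost; none of these figures are real pricing.
PEAK_UNITS = 100          # capacity needed during e.g. a Christmas spike
AVG_UNITS = 20            # typical demand the rest of the year (80% idle at peak sizing)
COST_PER_UNIT_MONTH = 10  # hypothetical cost per capacity unit per month

# On-premise: buy for the peak, carry it for twelve months.
on_prem_yearly = PEAK_UNITS * COST_PER_UNIT_MONTH * 12

# Consumption-based: eleven average months plus one peak month.
usage_based_yearly = (AVG_UNITS * 11 + PEAK_UNITS * 1) * COST_PER_UNIT_MONTH

print(on_prem_yearly, usage_based_yearly)  # 12000 vs 3200
```

With these toy numbers, peak-sized hardware costs almost four times as much per year as paying for what is actually consumed, which is the TCO argument the answer is making.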

Additionally, there's the aspect of cost-effectiveness. We can calculate the total cost of ownership (TCO), which is crucial for our customers to understand. Our business value team helps customers understand that it's not just about comparing self-hosted Kafka with Confluent Cloud. It's about considering the end-to-end TCO. Open-source Kafka requires full-time employees to manage it 24/7, along with the associated costs and risks of fixing issues when they arise. Confluent Cloud offers end-to-end cost-effectiveness, helping customers save significantly, as proven by many of our existing clients.

And as discussed during the Kafka Summit in India, we announced an even more important development: the introduction of "Freight clusters." This is a game-changer for our customers. Previously, the way Confluent Cloud was architected was mainly for critical systems, offering 99.99% SLAs and so on, which came with a higher price point. Now, with Freight clusters, customers can use Confluent Cloud, reducing costs by up to 90% compared to the previous pricing. These clusters are designed for non-critical but important use cases, often involving high volumes, such as log analytics or clickstream analytics. These scenarios don't require the same level of SLA, so a slight decrease in speed and SLA is acceptable.

Customers now have the choice between the premium clusters for critical workloads, with high SLAs, and the more cost-effective Freight clusters for data transportation and less critical tasks. This flexibility allows them to optimize costs according to their specific needs.

In what ways does Confluent address data privacy and security concerns for its Indian clients, especially those dealing with sensitive customer information?

This is super critical, because without data privacy and compliance, banks wouldn't buy Confluent, it's that easy. So from the beginning, enterprise readiness was the most important thing, because banks can only deploy Confluent if it's secure, built with data privacy in mind, and compliant.

This is why I always compare Apache Kafka to a car engine. Because it's a core product that you need to build things like compliance and security around. But, with Confluent, you more or less buy the complete car. So it's safe, it's secure, it's connected, and it has 24/7 support.

The key piece of our platform is data security and privacy, so we have capabilities like access control, audit logs, attribute-level encryption, and data contracts with policy enforcement, and the list goes on and on. With these kinds of capabilities, Confluent customers can deploy even their most critical workloads, even in Confluent Cloud and not just on-premise.
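On the client side, the security posture described above starts with an authenticated, encrypted connection. The sketch below shows standard librdkafka-style configuration keys as used by Kafka clients (the endpoint and credential values are placeholders, and access control and audit logging are enforced server-side, so they do not appear here).

```python
# Placeholder credentials; in practice these come from a secrets manager.
secure_config = {
    "bootstrap.servers": "broker.example.com:9092",  # hypothetical endpoint
    "security.protocol": "SASL_SSL",                 # TLS encryption in transit
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "API_KEY_PLACEHOLDER",
    "sasl.password": "API_SECRET_PLACEHOLDER",
}

# A quick sanity check that plaintext transport is never configured.
assert "SSL" in secure_config["security.protocol"]
print("client config enforces encrypted transport")
```

This is only the transport layer; the server-side controls the answer lists (ACLs, audit logs, policy-enforced data contracts) sit on top of it.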

In India, Mindgate, which builds payment solutions, is a great example of that. This is some of the most critical and regulated data, not just in India but in many countries, and it is now being used for real-time payments.

Are there any intriguing upcoming projects that you're currently involved in?

Open-source Kafka, in which we invest significantly, has a great deal of innovation happening. For example, Queues for Kafka is an exciting new development that is coming soon. Another upcoming feature will introduce support for a two-phase commit protocol, enabling more complex transactional workflows.
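The two-phase commit idea mentioned above can be illustrated with a toy coordinator. This is a simulation of the general protocol, not the upcoming Kafka feature's API: in phase one every participant must vote "prepared"; only if all vote yes does anyone commit, otherwise everyone aborts.

```python
class Participant:
    """A toy transaction participant, e.g. a database or a message broker."""
    def __init__(self, name: str, healthy: bool = True):
        self.name, self.healthy, self.state = name, healthy, "init"

    def prepare(self) -> bool:
        # Vote yes only if this participant can guarantee a later commit.
        self.state = "prepared" if self.healthy else "aborted"
        return self.healthy

    def commit(self) -> None:
        self.state = "committed"

    def abort(self) -> None:
        self.state = "aborted"

def two_phase_commit(participants: list) -> bool:
    # Phase 1: ask every participant to prepare (vote).
    if all(p.prepare() for p in participants):
        # Phase 2: unanimous yes, so commit everywhere.
        for p in participants:
            p.commit()
        return True
    # Any "no" vote aborts the whole transaction on every participant.
    for p in participants:
        p.abort()
    return False

db, broker = Participant("database"), Participant("kafka")
assert two_phase_commit([db, broker]) and db.state == broker.state == "committed"
```

The guarantee the protocol buys is atomicity across systems: either both the database and the broker apply the transaction, or neither does.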

Additionally, looking back at previous Kafka Summits and events, we have announced numerous features. For example, at the Kafka Summit in London a few weeks ago, we made several announcements about cost reductions, making the Confluent Cloud offering more affordable. This optimization of our infrastructure aims to onboard more and more customers, ensuring they not only receive the best product but also the right price, making it a valuable product for them.