Introduction:
Hi, I’m Dhara Parikh, a Senior Database Administrator and Gen AI Engineer exploring AI and data innovation — building intelligent, data-driven applications using modern frameworks.
With experience across platforms like SQL Server and DB2, I’m passionate about bridging traditional database management with AI-powered solutions. My PostgreSQL journey began with a pgvector use case, where I integrated vector similarity search with Azure OpenAI to enable smarter data discovery.
PostgreSQL inspires me for its blend of reliability and innovation — empowering professionals like me to turn data into intelligent solutions.
Journey in PostgreSQL
My PostgreSQL journey began when I was encouraged to explore how natural language could be used to interact with databases.
This led me to pgvector and other emerging AI capabilities in PostgreSQL — including semantic search, sentiment analysis, text embeddings, and similarity matching.
It’s been exciting to see how PostgreSQL can combine data intelligence with AI innovation, making it a powerful platform for the next generation of applications.
Can you share a pivotal moment or project in your PostgreSQL career that has been particularly meaningful to you?
A key moment in my PostgreSQL journey was building a Q&A application using pgvector to enable natural language interaction with data. It improved accuracy, made insights more accessible, and reduced infrastructure costs by around 40%. This project showed me how PostgreSQL can power AI-driven and cost-efficient solutions.
Contributions and Achievements:
One of the contributions I’m most proud of is designing a Generative AI application powered by PostgreSQL.
The goal was to make data interaction more intuitive — allowing users to query information in natural language. Using pgvector as the foundation, I stored and searched text embeddings generated through Azure OpenAI, and integrated it with the LangChain framework to orchestrate context-aware retrieval and response generation.
The solution enabled semantic search, knowledge discovery, and intelligent Q&A over structured and unstructured data — all within PostgreSQL. What made it meaningful was seeing how a traditional database could evolve into a modern vector intelligence layer, bridging enterprise data with AI in a seamless, scalable way.
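To make the idea of vector similarity search concrete, here is a minimal, self-contained Python sketch of the distance math behind pgvector's cosine-distance operator (`<=>`). The document names, vectors, and dimensions are toy stand-ins for real Azure OpenAI embeddings, which typically have hundreds or thousands of dimensions; this is an illustration of the concept, not the production pipeline described above.

```python
import math

def cosine_distance(a, b):
    """Cosine distance (1 - cosine similarity), the metric behind
    pgvector's <=> operator."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" standing in for real embedding vectors.
documents = {
    "invoice policy": [0.9, 0.1, 0.0],
    "vacation policy": [0.1, 0.9, 0.1],
    "expense report": [0.8, 0.2, 0.1],
}

def nearest(query_vec, docs, k=2):
    """Return the k documents closest to the query vector, mirroring
    SQL like: ORDER BY embedding <=> query_vec LIMIT k."""
    ranked = sorted(docs, key=lambda name: cosine_distance(query_vec, docs[name]))
    return ranked[:k]

print(nearest([0.85, 0.15, 0.05], documents))
```

In the actual pgvector setup, the same ranking happens inside PostgreSQL over an indexed `vector` column, so the database, not the application, does the similarity work.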
This project reaffirmed my belief that PostgreSQL is not just a database — it’s a powerful AI-ready platform capable of driving the next wave of intelligent data experiences.
(II) Have you faced any challenges in your work with PostgreSQL, and how did you overcome them?
Yes, I’ve faced a few challenges while working with PostgreSQL, but each one has helped me grow and understand the system more deeply.
At times, handling large datasets with multiple users led to slow-running queries and indexing issues. To resolve this, I focused on analyzing execution plans, tuning queries, and adjusting autovacuum settings to prevent table bloat. These small but consistent improvements helped make PostgreSQL run faster and more efficiently under heavy workloads.
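As a small illustration of the plan-analysis work mentioned above, the sketch below walks a plan tree shaped like PostgreSQL's `EXPLAIN (FORMAT JSON)` output and flags high-cost nodes. The sample plan itself is hypothetical; only the key names (`Node Type`, `Total Cost`, `Plans`) follow PostgreSQL's actual JSON plan format.

```python
def expensive_nodes(plan, threshold):
    """Recursively walk an EXPLAIN (FORMAT JSON)-style plan tree and
    collect (node type, cost) pairs whose estimated Total Cost exceeds
    the threshold."""
    found = []
    if plan.get("Total Cost", 0) > threshold:
        found.append((plan["Node Type"], plan["Total Cost"]))
    for child in plan.get("Plans", []):
        found.extend(expensive_nodes(child, threshold))
    return found

# A hypothetical plan resembling EXPLAIN (FORMAT JSON) output.
sample_plan = {
    "Node Type": "Hash Join",
    "Total Cost": 5200.0,
    "Plans": [
        {"Node Type": "Seq Scan", "Total Cost": 4800.0},
        {"Node Type": "Index Scan", "Total Cost": 120.0},
    ],
}

print(expensive_nodes(sample_plan, threshold=1000.0))
```

A costly sequential scan dominating a join, as in this example, is the classic signal that a new index or a rewritten predicate is worth investigating.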
When I started using PostgreSQL for AI workloads like semantic search and embeddings, it was a new learning curve. Managing large vector data and similarity searches efficiently required experimentation. I worked with the pgvector extension, explored indexing techniques, and fine-tuned storage and query settings to improve search performance and accuracy.
Each challenge reinforced my belief that PostgreSQL is not just a database — it’s a flexible and evolving platform that can handle both enterprise workloads and modern AI-driven applications with the right tuning and understanding.
Community Involvement:
I engage with the PostgreSQL community by sharing my learning experiences, PoC outcomes, and use cases that combine AI and PostgreSQL, such as semantic search and vector similarity. I also follow community discussions, blogs, and updates to stay informed about new releases and best practices.
Recently, I’ve started presenting my work and insights to encourage others to explore PostgreSQL’s capabilities in AI-driven and modern data workloads.
(II) Can you share your experience with mentoring or supporting other women in the PostgreSQL ecosystem?
I’ve been actively supporting and mentoring other women in the PostgreSQL community through webinars, engineering exchanges, and knowledge-sharing sessions.
I also enjoy writing Medium articles and presenting my PostgreSQL use cases, especially where it connects with AI and ML applications.
These interactions are a great way to share learning, encourage experimentation, and help more women gain confidence in exploring modern innovations with PostgreSQL.
Insights and Advice:
My advice to women starting their careers in technology, especially in database management and PostgreSQL, is to embrace the unknown — that’s where most growth happens. Every challenge you take on will strengthen your skills and confidence.
Keep learning, experiment with emerging areas like AI/ML-driven data management, and share your journey with others.
Most importantly, believe in yourself — self-belief is what turns obstacles into opportunities and curiosity into expertise.
(II) Are there any resources (books, courses, forums) you’d recommend to someone looking to deepen their PostgreSQL knowledge?
I’ve found “PostgreSQL: Up and Running” by Regina Obe & Leo Hsu and “PostgreSQL 13 Cookbook” (O’Reilly) extremely helpful in building both foundational and advanced DBA skills. I also stay connected through the PostgreSQL community forums, which are great for learning, sharing experiences, and exploring new ideas.
Looking Forward:
Future AI developments in PostgreSQL are exciting due to the integration of asynchronous I/O in PostgreSQL 18, which offers significant performance improvements ideal for AI workloads, and enhanced indexing and query optimization features that support complex AI-driven data processing. Looking ahead, PostgreSQL is evolving to better handle AI applications by combining efficient vector search capabilities, cloud-native storage integration, and advanced hybrid query techniques. These enhancements make PostgreSQL a powerful, AI-ready database platform for scalable and intelligent applications.
(II) Do you have any upcoming projects or goals within the PostgreSQL community that you can share?
Yes, we’re currently working on an exciting project — building a database migration tool to seamlessly move Oracle workloads to PostgreSQL on Amazon RDS.
What makes it unique is the use of AI-driven automation, leveraging frameworks like LangGraph and agentic workflows to simplify schema conversion, query translation, and performance optimization.
The goal is to make enterprise migrations smarter, faster, and more resilient using PostgreSQL as the target platform.
Personal Reflection:
Being part of the PostgreSQL community means belonging to a network that fosters learning, collaboration, and innovation. I truly appreciate how the community constantly supports growth, visibly advances the technology, and values people’s contributions. Working on AI-driven Oracle-to-PostgreSQL migration tools has shown me how this collective effort turns complex ideas into practical, real-world solutions.
(II) How do you balance your professional and personal life, especially in a field that is constantly evolving?
I plan my day intentionally, guided by the principles of Atomic Habits by James Clear — focusing on small, consistent actions that drive meaningful progress. I make it a point to learn something new every day, whether it’s a PostgreSQL feature, an AI concept, or a new life skill. I dedicate time to reading, mindfulness, and fitness to stay focused and energized, and spending moments with my kid keeps me grounded and inspired.
Message to the Community:
If I could share one thought with the PostgreSQL community, especially with women — it would be this: don’t be scared to dive into the unknown. Every step outside your comfort zone opens up a new world of learning and confidence. PostgreSQL, like any technology, rewards curiosity and persistence. Keep exploring, keep experimenting, and remember — your ideas and contributions can inspire many others to take that first step too.
Talk Title: PostgresML: Revolutionizing Machine Learning with SQL
In today’s data-driven world, organizations often struggle with complex machine learning infrastructures and data movement challenges. This talk introduces PostgresML, a game-changing PostgreSQL extension that brings machine learning capabilities directly into your database. We’ll explore how PostgresML enables developers and data teams to perform sophisticated ML operations using familiar SQL commands, eliminating the need for separate ML systems. Through live demonstrations, we’ll showcase practical implementations of model training, real-time predictions, and GPU acceleration features. Whether you’re a database engineer, ML practitioner, or technical lead, you’ll learn how to leverage PostgresML to simplify your ML pipeline, enhance security, and accelerate deployment. Join us to discover how this innovative tool is bridging the gap between traditional database operations and modern machine learning workflows.
Talk Title: Developers are decision-makers now. DevRel gets you there faster
DevRel as a role has existed since the 1990s, yet it remains one of the least understood roles in tech. Whether due to changing definitions, role titles, or evolving industries, DevRel has transformed significantly over the past few years—yet it continues to shape the devtool landscape. Since 2023, we’ve seen explosive AI growth alongside a surge in tech companies and technical talent. But who reaches these developers? Developers distrust traditional marketing. Who builds the samples, docs, tutorials, and SDKs they rely on? DevRel has become more critical than ever, especially as developers increasingly become decision-makers. In this talk, we’ll explore what DevRel is, how it drives impact, and how you can build an effective DevRel program.
Talk Title: DPDPA (Digital Personal Data Protection Act) Unleashed – Why It Matters for Women in Data
India’s Digital Personal Data Protection Act (DPDPA) is reshaping how organisations collect, store and use personal data, with a phased, 18‑month rollout. This presentation explores what the policy and law contain, then dives into what they unlock for careers in data, security and consulting, especially for women. As a data architect who designs database architectures, I will connect the legal constructs (Data Principals, Fiduciaries, Consent Managers, the Board) to real-world data and database practices, and show how DPDPA can be a powerful career accelerator, not just a compliance requirement.
Talk Title: Where Technology Meets Customer Needs: Lessons from a Newbie Solutions Engineer
When I stepped into the world of open-source databases as a Solutions Engineer, I expected to feel overwhelmed, but I found a role that made surprising sense. In this talk, I’ll share my journey navigating PostgreSQL with the help of modern cloud platforms like Aiven and DigitalOcean, tuning tools like DBtune, and migration partners like Hexacluster. This isn’t a deep dive into internals; it’s a practical, beginner-friendly session on reducing the friction of managing PostgreSQL in real-world environments. Along the way, I’ll highlight the often-overlooked role of a Solutions Engineer: the human bridge between customer needs and engineering solutions. If you’re a student, a DBA, a DevOps engineer, or just Postgres-curious, you’ll walk away with not only tools to explore, but also a career path to consider.

Talk Title: Architecting Ethical and Responsible AI with PostgreSQL 18
Have you ever built an Agentic AI application with a framework such as LangGraph and the pgai extension, only to find during testing that the results are poor or biased toward a particular demographic, and not known what to do next? Organizations developing such applications often encounter these issues during implementation and testing, and identifying their root causes can be difficult without proper tools and methodologies. This session addresses these challenges by introducing Responsible AI interpretability and explainability techniques. Participants will learn how to understand and trace the model’s decision-making process, enabling them to identify why specific results are generated. These capabilities are essential for meeting compliance requirements in regulated sectors, including banking and insurance. Attendees will gain practical knowledge on building Agentic AI applications that incorporate Responsible AI principles, ensuring transparent, accountable, and fair outcomes.
Rumi
Talk Title: New features of PostgreSQL 18
PostgreSQL 18 continues the PostgreSQL project’s long-standing focus on performance, scalability, reliability, and developer productivity, building incrementally on the improvements delivered in PostgreSQL 15–17.
Rather than introducing disruptive changes, PostgreSQL 18 emphasizes refinement and maturity across core subsystems such as query execution, indexing, concurrency, replication, and observability, making PostgreSQL even more suitable for enterprise-scale and cloud-native workloads.
Talk Title: Platform Engineering Unpacked: Architecture, Evolution, and Hard-Won Lessons
The way engineering teams build and deliver software has changed dramatically. We’ve moved from manual server setups to automated pipelines, from ticket-based operations to self-service workflows, and from siloed teams to platform-driven organisations. This shift gave rise to Platform Engineering, a discipline focused on creating the internal systems, golden paths, and tooling that empower developers to move faster with less friction.
In this session, I’ll walk through the evolution that brought us here and why Platform Engineering has become a strategic priority across industries. I’ll share the architecture patterns that define successful platforms, how self-service emerges as a core capability, and the practical dos and don’ts learned from building real-world internal platforms.
Attendees will gain a clear understanding of:
Why DevOps wasn’t enough, and what Platform Engineering solves
The natural evolution from scripts → automation → abstractions → platforms
What makes a good platform (and what absolutely doesn’t)
How to design developer-centered systems and golden paths
My firsthand lessons from enabling engineering teams at scale
This talk gives a foundational, experience-driven view of what Platform Engineering really means today and how teams can start their journey the right way.
Our idea explores the implementation of AI-driven query optimization in PostgreSQL, addressing the limitations of traditional optimization methods in handling modern database complexities. We present an innovative approach using reinforcement learning for automated index selection and query plan optimization. Our system leverages PostgreSQL’s pg_stat_statements for collecting query metrics and employs HypoPG for index simulation, while a neural network model learns optimal indexing strategies from historical query patterns. Through comprehensive testing on various workload scenarios, we will validate the model’s ability to adapt to dynamic query patterns and complex analytical workloads. The research also examines the scalability challenges and practical considerations of implementing AI optimization in production environments.
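The selection loop described above can be sketched as a simple epsilon-greedy bandit. In this toy version, a synthetic cost function stands in for a HypoPG cost probe (which in the real system would ask the planner to estimate query cost with a hypothetical index in place), and the candidate index names and workload weights are invented for illustration; the real approach uses a neural network over pg_stat_statements metrics rather than this tabular learner.

```python
import random

def simulated_cost(index_choice, workload_weights):
    """Stand-in for a HypoPG cost probe: the estimated workload cost if
    the given hypothetical index existed. Entirely synthetic."""
    base = 100.0
    return base - workload_weights.get(index_choice, 0.0)

def choose_index(candidates, workload_weights, episodes=500, epsilon=0.1, seed=0):
    """Epsilon-greedy bandit over candidate indexes: mostly exploit the
    lowest average cost seen so far, occasionally explore at random.
    Untried candidates score 0.0, which acts as optimistic initialization
    and forces each one to be probed at least once."""
    rng = random.Random(seed)
    totals = {c: 0.0 for c in candidates}
    counts = {c: 0 for c in candidates}
    for _ in range(episodes):
        if rng.random() < epsilon:
            choice = rng.choice(candidates)
        else:
            choice = min(candidates,
                         key=lambda c: totals[c] / counts[c] if counts[c] else 0.0)
        cost = simulated_cost(choice, workload_weights)
        totals[choice] += cost
        counts[choice] += 1
    # Return the candidate with the lowest observed average cost.
    return min(candidates,
               key=lambda c: totals[c] / counts[c] if counts[c] else float("inf"))

weights = {"idx_orders_customer": 60.0, "idx_orders_date": 20.0}
best = choose_index(["idx_orders_customer", "idx_orders_date", "no_index"], weights)
print(best)  # the index that most reduces the simulated workload cost
```

The production system replaces both pieces: the planner (via HypoPG) supplies real cost estimates, and a learned model generalizes across query patterns instead of tracking per-candidate averages.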
Our findings establish a foundation for future developments in self-tuning databases while offering immediate practical benefits for PostgreSQL deployments. This work contributes to the broader evolution of database management systems, highlighting the potential of AI in creating more efficient and adaptive query optimization solutions.
This talk provides an introductory overview of Artificial Intelligence (AI) and Machine Learning (ML), exploring key concepts and their application in building intelligent systems. It will highlight the essential AI/ML techniques, such as supervised and unsupervised learning, and discuss practical use cases in modern industries. The session also focuses on how PostgreSQL, with its powerful extensions like PostgresML, TimescaleDB, and PostGIS, supports the development of AI-powered applications. By leveraging PostgreSQL’s ability to handle complex datasets and integrate machine learning models, participants will learn how to build scalable, intelligent solutions directly within the database environment.
Success is a product of Action, External Factors, and Destiny.
Out of these three, the only controllable aspect is our action. Again, action is the result of our EQ, IQ, SQ, and WQ (Willingness Quotient) together.
We all want to be successful and keep trying to motivate ourselves with external factors. We read inspirational books, listen to great personalities, and whenever possible upgrade ourselves with more knowledge and the list goes on.
Indeed these are excellent motivators, but in this process, we forget the most important source of energy, YOU!
We read other stories to feel inspired, thinking “I am not enough!”
But the day we start accepting ourselves, introspecting, understanding, and aligning our life purpose with our routine, we find the internal POWER. This is a continuous source of motivation and energy, which we need in our down moments. When we feel lonely or stuck and seek help, our inner voice is the greatest companion.
But, how many times do we consciously think about our “Subconscious”?
“Journey to Self” is our structured coaching program, where we take the focus back from the outside world and delve deep inside to find our inner strength, focusing on self-acceptance and personal growth.
I believe everyone has POWER within them!
Let’s be the POWERHOUSE!
Human, AI, and Personalized User Experience for DB Observability: A Composable Approach
Database users across various technical levels are frequently frustrated by the time-consuming and inefficient process of identifying the root causes of issues. This process often involves navigating multiple systems or dashboards, leading to delays in finding solutions and potential downstream impacts on operations.
The challenge is compounded by the varying levels of expertise among users. It is essential to strike the right balance between specialized and generalized experiences. Oversimplification can result in the loss of critical information, while an overwhelming amount of data can alienate certain users.
Developers and designers are constantly navigating these trade-offs to deliver optimal user experiences. The integration of AI introduces an additional layer of complexity. While AI can provide personalized experiences within databases, it is crucial to maintain user trust and transparency in the process.
The concept of personalized composable observability offers a potential solution. By combining the strengths of human expertise, information balance, and AI-driven personalization, we can create intuitive and user-friendly experiences. This approach allows users to tailor their observability tools and workflows to their specific needs and preferences.