I’m Minaz, a senior solution architect with a focus on database technologies, cloud solutions, and digital business transformation. My background spans several years of working with various database systems, including PostgreSQL, where I’ve helped organizations migrate, optimize, and scale their databases. My work revolves around mentoring teams, driving innovation, and creating meaningful business impacts using PostgreSQL and other technologies.
In my current role, I focus on harnessing PostgreSQL’s powerful AI/ML features for vector databases and performance optimization. I’m passionate about empowering others, especially women in tech, to grow in their careers and make a lasting impact in the tech world.
I began my journey with PostgreSQL during a pivotal project that involved migrating legacy databases to open-source alternatives like PostgreSQL. This wasn’t just a straightforward migration—it was a heterogeneous migration, shifting from proprietary databases like Oracle to PostgreSQL.
The process posed significant challenges due to differences in data types, schema structures, stored procedures, and functions that had to be meticulously addressed.
What truly captured my interest in PostgreSQL was its extensibility. As I worked through these challenges, I discovered the immense power of PostgreSQL’s extensions, such as PostGIS for spatial data, pg_stat_statements for performance analysis, and pg_repack for table maintenance. These tools revolutionised how we approached database management and optimization, allowing us to create highly efficient and tailored solutions.
A particularly meaningful project in my PostgreSQL career was a complex migration from Oracle to PostgreSQL for a client looking to modernize their database infrastructure while reducing costs.
What made this project so meaningful was not just the technical success but the larger impact it had on the client’s business. The migration helped them achieve significant cost savings, increased operational flexibility, and expanded their global digital footprint. This project was close to my heart because it highlighted how PostgreSQL could serve as a powerful tool in modernization efforts, and it showcased the importance of strategic planning and execution in achieving successful outcomes.
One of my most recent contributions has been leveraging PostgreSQL as a next-generation AI database, capable of supporting AI/ML workloads and functioning as a vector database for advanced applications. The extensibility of PostgreSQL allowed us to harness powerful extensions for machine learning and vectorization, driving innovation in our AI-driven projects.
By integrating PostgreSQL with PL/Python, we embedded machine learning models directly within the database, utilizing libraries like TensorFlow and scikit-learn. This enabled us to run ML models natively, facilitating real-time data analysis and predictions. We implemented AI solutions for tasks like anomaly detection and predictive maintenance, all within PostgreSQL’s robust environment.
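As a minimal sketch of the kind of scoring logic that can be wrapped in a PL/Python function this way (the readings and threshold here are illustrative, not the project’s actual models):

```python
# Simple z-score anomaly detector, the sort of logic that could run
# in-database via a PL/Python function. Data and threshold are made up
# for illustration only.
from statistics import mean, stdev

def zscore_anomalies(values, threshold=3.0):
    """Return the values whose z-score magnitude exceeds the threshold."""
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []  # constant series: nothing can be anomalous
    return [v for v in values if abs(v - mu) / sigma > threshold]

# One obvious outlier among otherwise stable sensor readings.
readings = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 55.0]
print(zscore_anomalies(readings, threshold=2.0))  # → [55.0]
```

In a real deployment the same function body would sit inside `CREATE FUNCTION ... LANGUAGE plpython3u`, so the detection runs next to the data with no export step.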
In addition to AI and ML workloads, we demonstrated during an internal hackathon how PostgreSQL can play a key role in achieving carbon-free data center solutions.
Yes, I’ve faced several challenges in my work with PostgreSQL, but each one provided valuable learning opportunities. Some of the key challenges and how I overcame them include:
1. Performance Optimization in Large-Scale Environments:
Challenge: When working with large datasets or high-concurrency workloads, we encountered performance bottlenecks, especially with complex queries and heavy indexing.
Solution: To overcome this, we focused on query optimization by leveraging PostgreSQL’s EXPLAIN and ANALYZE tools to analyze execution plans. We also implemented indexing strategies and fine-tuned vacuuming and auto-vacuum settings to avoid table bloat, improving overall database performance. By combining these strategies, we significantly enhanced the speed and responsiveness of PostgreSQL under heavy loads.
2. Vector Database Implementation:
Challenge: Using PostgreSQL as a vector database for high-dimensional data (such as for NLP or recommendation systems) required handling large volumes of data and performing complex vector searches efficiently.
Solution: We leveraged the pgvector extension for storing and querying vectors efficiently. To overcome the challenges of high-dimensional search performance, we implemented nearest-neighbour search using vector indexing and cosine similarity. Additionally, careful attention was given to managing the indexing process for large datasets, which helped us maintain both the performance of the vector search and the overall responsiveness of the database.
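To make the idea concrete, here is a plain-Python sketch of the cosine-similarity nearest-neighbour search that pgvector performs (and accelerates with vector indexes) via its distance operators; the vectors and query below are illustrative, and a real deployment would use the extension’s operators rather than application-side loops:

```python
# Plain-Python illustration of cosine-similarity nearest-neighbour
# search, the operation pgvector exposes through its cosine-distance
# operator. Embeddings here are tiny made-up vectors.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest(query, vectors):
    """Return the index of the stored vector most similar to the query."""
    return max(range(len(vectors)),
               key=lambda i: cosine_similarity(query, vectors[i]))

docs = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.7, 0.7, 0.0]]
print(nearest([0.9, 0.1, 0.0], docs))  # → 0 (closest to the first vector)
```

With pgvector, the equivalent query orders rows by cosine distance and lets an index on the vector column keep the search fast as the dataset grows.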
As a Solution Architect my primary focus is on transforming the teams I work with. I strive to create a significant business impact that the organization can be proud of, while also fostering an environment where others can learn and grow.
By delivering PostgreSQL training to my team, I equipped them with the skills to leverage the latest technology, foster creative problem-solving, and deliver innovative solutions to our clients.
In my experience within the PostgreSQL ecosystem, I have had the opportunity to mentor and support women as they navigate key moments in their careers, especially those re-entering the workforce after a career break or maternity leave, or transitioning into more technical roles. I’ve found that these moments can be particularly challenging, and providing the right guidance and creating a supportive environment are critical for their success.
By offering tailored advice and sharing resources, I’ve helped women build confidence in their skills and overcome obstacles that can arise during these transitions. I prioritize fostering an inclusive and collaborative atmosphere where individuals feel empowered to learn and grow.
Seeing these women thrive and make meaningful contributions to the PostgreSQL community has been incredibly fulfilling. Their success not only benefits their personal development but also strengthens the ecosystem as a whole.
For women starting their careers in technology, especially in database management and PostgreSQL, I would offer the following advice:
Don’t be afraid to experiment. Be confident and assertive: don’t hesitate to ask questions, share your ideas, and advocate for yourself. Confidence is key in advancing your career.
By staying curious, confident, and proactive, you’ll be able to carve out a successful and fulfilling career in database management, PostgreSQL, and the broader tech industry. You have the power to make an impact, and the possibilities are endless.
1. “PostgreSQL: Up and Running” by Regina Obe & Leo Hsu
2. “PostgreSQL 13 Cookbook”
3. Udemy: “The Complete PostgreSQL Bootcamp”
4. PostgreSQL Reddit Community (/r/PostgreSQL)
5. The PostgreSQL forums (https://www.postgresql.org/community/) are another way to connect with the PostgreSQL community, share knowledge, and get help from experienced users and developers.
As PostgreSQL continues to evolve, there are several exciting developments on the horizon that I’m particularly enthusiastic about:
1. Integration with AI and ML Workloads
2. AI-Powered Vector Search
3. Data Encryption Enhancements: As data security becomes increasingly important, PostgreSQL is continuing to improve its encryption capabilities, including both at-rest and in-transit encryption.
4. Big Data Integrations: PostgreSQL’s evolving ability to handle larger datasets, especially in the context of data lakes, is something to watch. There is growing interest in using PostgreSQL as a part of a data lake solution where data from multiple sources, both structured and unstructured, can be analyzed with a unified query engine.
I am looking forward to collaborating further with the PostgreSQL community by sharing knowledge, mentoring others, or driving initiatives that can make a tangible impact.
Being part of the PostgreSQL community means being part of a passionate and supportive ecosystem that thrives on collaboration, knowledge-sharing, and continuous improvement.
For me, it’s not just about working with a powerful and flexible database; it’s about connecting with like-minded professionals who are equally dedicated to pushing the boundaries of what’s possible with PostgreSQL.
Balancing professional and personal life requires setting clear boundaries and prioritizing both continuous learning and personal well-being.
I stay updated on industry trends by integrating learning into my routine, while also making time to disconnect and recharge. This balance helps me remain effective and energized in my work, ensuring long-term success and personal fulfilment.
To the PostgreSQL community, especially to the women who are part of it or aspiring to join: your voice, perspective, and contributions are invaluable. Don’t be discouraged by challenges; every step you take is a step towards growth and innovation. Lean on the community, share your experiences, and remember that we are all here to support each other. You belong here, and your unique insights will help shape the future of PostgreSQL and the tech industry as a whole. Keep learning, stay curious, and never underestimate the power of your contributions.
Talk Title: PostgresML: Revolutionizing Machine Learning with SQL
In today’s data-driven world, organizations often struggle with complex machine learning infrastructures and data movement challenges. This talk introduces PostgresML, a game-changing PostgreSQL extension that brings machine learning capabilities directly into your database. We’ll explore how PostgresML enables developers and data teams to perform sophisticated ML operations using familiar SQL commands, eliminating the need for separate ML systems. Through live demonstrations, we’ll showcase practical implementations of model training, real-time predictions, and GPU acceleration features. Whether you’re a database engineer, ML practitioner, or technical lead, you’ll learn how to leverage PostgresML to simplify your ML pipeline, enhance security, and accelerate deployment. Join us to discover how this innovative tool is bridging the gap between traditional database operations and modern machine learning workflows.
Talk Title: Developers are decision-makers now. DevRel gets you there faster
DevRel as a role has existed since the 1990s, yet it remains one of the least understood roles in tech. Whether due to changing definitions, role titles, or evolving industries, DevRel has transformed significantly over the past few years—yet it continues to shape the devtool landscape. Since 2023, we’ve seen explosive AI growth alongside a surge in tech companies and technical talent. But who reaches these developers? Developers distrust traditional marketing. Who builds the samples, docs, tutorials, and SDKs they rely on? DevRel has become more critical than ever, especially as developers increasingly become decision-makers. In this talk, we’ll explore what DevRel is, how it drives impact, and how you can build an effective DevRel program.
Talk Title: DPDPA (Digital Personal Data Protection Act) Unleashed – Why It Matters for Women in Data
India’s Digital Personal Data Protection Act (DPDPA) is reshaping how organisations collect, store, and use personal data, with a phased, 18‑month rollout. This presentation explores what the policy and law contain, then dives into what they unlock for careers in data, security, and consulting—especially for women. As a data architect who designs database architectures, I will connect the legal constructs (Data Principals, Fiduciaries, Consent Managers, the Board) to real-world data and database practices, and show how the DPDPA can be a powerful career accelerator, not just a compliance requirement.
Talk Title: Where Technology Meets Customer Needs: Lessons from a Newbie Solutions Engineer
When I stepped into the world of open-source databases as a Solutions Engineer, I expected to feel overwhelmed, but I found a role that made surprising sense. In this talk, I’ll share my journey navigating PostgreSQL with the help of modern cloud platforms like Aiven and DigitalOcean, tuning tools like DBtune, and migration partners like Hexacluster. This isn’t a deep dive into internals; it’s a practical, beginner-friendly session on reducing the friction of managing PostgreSQL in real-world environments. Along the way, I’ll highlight the often-overlooked role of a Solutions Engineer: the human bridge between customer needs and engineering solutions. If you’re a student, a DBA, a DevOps engineer, or just Postgres-curious, you’ll walk away with not only tools to explore, but also a career path to consider.

Talk Title: Architecting Ethical and Responsible AI with PostgreSQL 18
Have you ever developed an Agentic AI application using an agentic framework such as LangGraph and the pgai extension, only to find during testing that the results are poor or biased toward a particular demographic? Identifying the root causes of such issues can be difficult without proper tools and methodologies. This session addresses these challenges by introducing Responsible AI interpretability and explainability techniques. Participants will learn how to understand and trace the model’s decision-making process, enabling them to identify why specific results are generated. These capabilities are essential for meeting compliance requirements in regulated sectors, including banking and insurance. Attendees will gain practical knowledge on building Agentic AI applications that incorporate Responsible AI principles, ensuring transparent, accountable, and fair outcomes.
Rumi
Talk Title: New features of PostgreSQL 18
PostgreSQL 18 continues the PostgreSQL project’s long-standing focus on performance, scalability, reliability, and developer productivity, building incrementally on the improvements delivered in PostgreSQL 15–17.
Rather than introducing disruptive changes, PostgreSQL 18 is expected to emphasize refinement and maturity across core subsystems such as query execution, indexing, concurrency, replication, and observability, making PostgreSQL even more suitable for enterprise-scale and cloud-native workloads.
Talk Title: Platform Engineering Unpacked: Architecture, Evolution, and Hard-Won Lessons
The way engineering teams build and deliver software has changed dramatically. We’ve moved from manual server setups to automated pipelines, from ticket-based operations to self-service workflows, and from siloed teams to platform-driven organisations. This shift gave rise to Platform Engineering, a discipline focused on creating the internal systems, golden paths, and tooling that empower developers to move faster with less friction.
In this session, I’ll walk through the evolution that brought us here and why Platform Engineering has become a strategic priority across industries. I’ll share the architecture patterns that define successful platforms, how self-service emerges as a core capability, and the practical dos and don’ts learned from building real-world internal platforms.
Attendees will gain a clear understanding of:
Why DevOps wasn’t enough, and what Platform Engineering solves
The natural evolution from scripts → automation → abstractions → platforms
What makes a good platform (and what absolutely doesn’t)
How to design developer-centered systems and golden paths
My firsthand lessons from enabling engineering teams at scale
This talk gives a foundational, experience-driven view of what Platform Engineering really means today and how teams can start their journey the right way.
Our idea explores the implementation of AI-driven query optimization in PostgreSQL, addressing the limitations of traditional optimization methods in handling modern database complexities. We present an innovative approach using reinforcement learning for automated index selection and query plan optimization. Our system leverages PostgreSQL’s pg_stat_statements for collecting query metrics and employs HypoPG for index simulation, while a neural network model learns optimal indexing strategies from historical query patterns. Through comprehensive testing on various workload scenarios, we will validate the model’s ability to adapt to dynamic query patterns and complex analytical workloads. The research also examines the scalability challenges and practical considerations of implementing AI optimization in production environments.
Our findings establish a foundation for future developments in self-tuning databases while offering immediate practical benefits for PostgreSQL deployments. This work contributes to the broader evolution of database management systems, highlighting the potential of AI in creating more efficient and adaptive query optimization solutions.
This talk provides an introductory overview of Artificial Intelligence (AI) and Machine Learning (ML), exploring key concepts and their application in building intelligent systems. It will highlight the essential AI/ML techniques, such as supervised and unsupervised learning, and discuss practical use cases in modern industries. The session also focuses on how PostgreSQL, with its powerful extensions like PostgresML, TimescaleDB, and PostGIS, supports the development of AI-powered applications. By leveraging PostgreSQL’s ability to handle complex datasets and integrate machine learning models, participants will learn how to build scalable, intelligent solutions directly within the database environment.
Success is the product of Action, External Factors, and Destiny.
Of these three, the only one we control is our action. Action, in turn, is the result of our EQ, IQ, SQ, and WQ (Willingness Quotient) together.
We all want to be successful and keep trying to motivate ourselves with external factors. We read inspirational books, listen to great personalities, and whenever possible upgrade ourselves with more knowledge and the list goes on.
Indeed, these are excellent motivators, but in this process we forget the most important source of energy: YOU!
We read other stories to feel inspired, thinking “I am not enough!”
But the day we start accepting ourselves, introspecting, understanding, and aligning our life purpose with our routine, we find the internal POWER. This is a continuous source of motivation and energy, which we need in our down moments. When we feel lonely, stuck, and in need of help, our inner voice is the greatest companion.
But, how many times do we consciously think about our “Subconscious”?
“Journey to Self” is our structured coaching program, where we take the focus back from the outside and delve deep inside to find our inner strength, focusing on self-acceptance and personal growth.
I believe everyone has POWER within them!
Let’s be the POWERHOUSE!
Human, AI, and Personalized User Experience for DB Observability: A Composable Approach
Database users across various technical levels are frequently frustrated by the time-consuming and inefficient process of identifying the root causes of issues. This process often involves navigating multiple systems or dashboards, leading to delays in finding solutions and potential downstream impacts on operations.
The challenge is compounded by the varying levels of expertise among users. It is essential to strike the right balance between specialized and generalized experiences. Oversimplification can result in the loss of critical information, while an overwhelming amount of data can alienate certain users.
Developers and designers are constantly navigating these trade-offs to deliver optimal user experiences. The integration of AI introduces an additional layer of complexity. While AI can provide personalized experiences within databases, it is crucial to maintain user trust and transparency in the process.
The concept of personalized composable observability offers a potential solution. By combining the strengths of human expertise, information balance, and AI-driven personalization, we can create intuitive and user-friendly experiences. This approach allows users to tailor their observability tools and workflows to their specific needs and preferences.