We’re thrilled to introduce Lætitia, a trailblazer in the PostgreSQL community and the visionary behind Postgres Women. In this inspiring feature, Lætitia takes us through her incredible journey—from coding her first C-function in PostgreSQL to becoming a passionate advocate for diversity and inclusion in Tech. Discover the key moments that defined her career, the hurdles she overcame, and the lasting impact she’s made on the PostgreSQL ecosystem.
Lætitia’s story is a testament to what passion and perseverance can achieve in the open-source world.
Journey in PostgreSQL:
In 2007, I joined a company that was a subcontractor for the French mapping agency. Their product is called “geoportail”. Of course, PostgreSQL was (and still is) the gold standard for geospatial databases. We had an issue on the development team: in French, there are many different ways to spell the same sounds. (And don’t get me started on mute letters in French!) The customer wanted the search engine to find a town even when its name was misspelled, as long as the spelling was phonetically correct. There was no extension to do that at the time.
So, I took on the task of writing a C function for Postgres to perform phonetic search in the database. It was a lot of fun! It was also simpler than it sounds, because I only had to handle French.
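Her actual implementation was a C function compiled into Postgres; purely to illustrate the idea, here is a toy sketch in Python. Every rule below is an invented simplification, not her real algorithm: it collapses a few French spellings that sound alike and drops mute endings, so two spellings of the same town produce the same key.

```python
import re

def french_phonetic_key(word: str) -> str:
    """Crude, illustrative phonetic key for French town names.

    A toy sketch only: collapse a few spellings of the same sound,
    drop mute endings, squash doubled letters.
    """
    w = word.lower()
    # Collapse spellings that sound alike ("eau"/"au" -> "o", "ph" -> "f", ...)
    for pattern, repl in [
        (r"eau|au", "o"),
        (r"ph", "f"),
        (r"qu|k", "c"),
        (r"ai|ei", "e"),
        (r"y", "i"),
    ]:
        w = re.sub(pattern, repl, w)
    # Drop trailing mute letters ("Paris" -> "pari")
    w = re.sub(r"[estdxz]+$", "", w)
    # Squash doubled letters ("renn" -> "ren")
    w = re.sub(r"(.)\1", r"\1", w)
    return w

# Two spellings of the same town compare equal:
print(french_phonetic_key("Bordeaux") == french_phonetic_key("Bordo"))
print(french_phonetic_key("Rennes") == french_phonetic_key("Rene"))
```

In the real thing, a function like this would live inside the database, so a query could match `WHERE phonetic_key(town_name) = phonetic_key('Bordo')` without shipping data to the application.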
Can you share a pivotal moment or project in your PostgreSQL career that has been particularly meaningful to you?
I had a child in December 2009. When I went back to work in March 2010, after maternity leave, my team didn’t need me anymore. They had worked without me for three months; they could keep working without me. So my days at work were quite boring.
Then my boss came in and said, “I know no one will be interested, but I have to tell you that there is a DBA opening in the company. No previous experience needed, training included.” Less than three minutes later, I was in his office saying, “I WANT that job.”
So, I began my journey as a DBA. The company needed a PostgreSQL DBA, so that’s where I started. I got one month of training and came back as a DBA! As soon as I returned with that golden “DBA” title, my former colleague stopped interrupting me in meetings and started listening! You have no idea what a title can do for you if you are from an underrepresented group!
Contributions and Achievements:
I have not contributed that much to PostgreSQL code. First, there is the time issue. I try to prioritize my time with my children as they grow up fast and will soon be out of my home. Second, there is the “what should I do?” issue. I got a great answer for that one from Dimitri Fontaine, a great French contributor to PostgreSQL. He told me: “If you don’t want to fight the community for approval of your patch, there’s a simple thing to do. Find a point where Postgres does not follow the SQL standard and make it comply.” That’s the best advice I got!
So, I scanned Markus Winand’s website (https://modern-sql.com/) to find a point where PostgreSQL didn’t comply with the SQL standard. And I found one that seemed simple to write! PostgreSQL lacked support for hyperbolic functions (sinh, cosh, tanh, and their inverses). I added them. I only did it to improve compliance with the SQL standard, but a few months later, a guy thanked me for this! His app needed hyperbolic functions!
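For readers who haven’t met these functions, a quick refresher of what the patch added (sketched in Python rather than SQL): the hyperbolic functions follow the textbook definitions built on the exponential function.

```python
import math

# The hyperbolic functions mirror the textbook definitions:
#   sinh(x) = (e^x - e^-x) / 2
#   cosh(x) = (e^x + e^-x) / 2
#   tanh(x) = sinh(x) / cosh(x)
x = 1.0
sinh = (math.exp(x) - math.exp(-x)) / 2
cosh = (math.exp(x) + math.exp(-x)) / 2

assert math.isclose(sinh, math.sinh(x))
assert math.isclose(cosh, math.cosh(x))
assert math.isclose(sinh / cosh, math.tanh(x))
# The fundamental identity cosh^2 - sinh^2 = 1 holds:
assert math.isclose(cosh**2 - sinh**2, 1.0)
```

In PostgreSQL itself the corresponding calls are simply `SELECT sinh(1), cosh(1), tanh(1);` and the inverse functions `asinh`, `acosh`, `atanh`.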
Have you faced any challenges in your work with PostgreSQL, and how did you overcome them?
When it comes to tech, there is always a solution or a proof that there is no solution. The difficult part is when you deal with human beings who can be so unreasonable!
My main problem is explaining to people that when they say they want 0 downtime and 0 data loss, it does not mean what they think. I got a good clue when Robert Haas told me: “‘Zero data loss’ is kind of a fuzzy term. If the Death Star shows up and does to Earth what it did to Alderaan, practically everybody is going to lose data.” (It’s cool that I can find that quote anytime on the company Slack, using the keyword “Alderaan.”)
Now, I try to define scenarios and quantify data loss and downtime in them.
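Quantifying those scenarios is mostly arithmetic. As an illustrative sketch (every figure below is invented, not from the interview): an availability target translates directly into a yearly downtime budget, and the backup or WAL-archiving interval bounds how much data a scenario can lose.

```python
# Turning fuzzy "zero downtime / zero data loss" wishes into numbers.
# Downtime budget: an availability target implies allowed minutes down per year.
# Data-loss bound: archiving WAL every N minutes means up to N minutes lost.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525600

def downtime_budget_minutes(availability: float) -> float:
    """Yearly downtime allowed by an availability target, e.g. 0.999."""
    return (1 - availability) * MINUTES_PER_YEAR

# Hypothetical scenarios (numbers are made up for illustration):
scenarios = [
    ("single server, nightly backups", 0.99, 24 * 60),
    ("streaming replica, 5-min WAL archiving", 0.999, 5),
]
for name, availability, max_loss_minutes in scenarios:
    print(f"{name}: up to {downtime_budget_minutes(availability):.0f} min/year "
          f"of downtime, up to {max_loss_minutes} min of data lost")
```

Framing the discussion this way lets a customer pick the scenario whose cost they can accept, instead of asking for an impossible absolute zero.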
Community Involvement:
My biggest time commitment to the community is being the treasurer for PostgreSQL Europe. Paying the bills, making sure we have insurance for events, preparing budgets, talking with accountants, doing taxes… The community needs all those tasks done, even though it’s not as fancy or fun as writing code.
I also speak at a lot of events. It takes time to prepare a talk, and speakers at most events are not paid.
I point women to the right people when they need help getting to speak at an event. I try to publicize free tickets for women attendees. I run brainstorming sessions for women to solve the I-don’t-know-what-to-talk-about problem.
I also have several patches to finish. But getting them approved means fighting to explain my point of view to the community, and that takes time and energy I don’t have at the moment.
Can you share your experience with mentoring or supporting other women in the PostgreSQL ecosystem?
I haven’t mentored women over the long term. I did small things: encouraging them to submit a talk, helping them find great topics, helping them find funds to speak at or attend an event, assisting when building a case for the code of conduct committee, helping them rehearse talks… I discovered great women and tried to help where and when I could, but I have to say most of them didn’t need my help. Remember, it’s OK to ask for help (but I know how difficult that is).
Insights and Advice:
Databases are a great topic. There are always new things to learn and old things to discover. As a database expert, people usually listen to you; they know there are many things they don’t know about this topic. My advice would be to be public about your knowledge: write and speak about it. Build a personal brand. People will then come to you with questions. When it’s simple, you’ll answer swiftly; when it’s hard, you’ll learn something.
Are there any resources (books, courses, forums) you’d recommend to someone looking to deepen their PostgreSQL knowledge?
That’s a question a lot of people ask me. So, it’s worth a blog post soon. There are several things you need to know:
SQL (I recommend https://pgexercises.com/, https://modern-sql.com/, and https://jakewheat.github.io/sql-overview/sql-2016-foundation-grammar.html)
Data modeling (if you want more practical resources, this book https://a.co/d/ccwFNrn is great, and make sure you check https://wiki.postgresql.org/wiki/Don%27t_Do_This)
PostgreSQL (https://www.postgresql.org/docs/current/index.html and http://www.interdb.jp/pg/)
Performance (https://use-the-index-luke.com and https://a.co/d/hzqzflo)
Looking Forward:
I’m excited about new features like temporal tables, but I’m worried about the performance cost. I want a big change like the one the zheap project aimed for, but it’s hard to make such a change in a collaborative project like PostgreSQL. The zheap idea is to change how Postgres implements MVCC (multiversion concurrency control), aiming to remove the need for vacuum. Can you imagine a world with no XID wraparound problem and no bloat?
Do you have any upcoming projects or goals within the PostgreSQL community that you can share?
I have several patches to improve, but I need to rest a little now, so I’m trying to slow down my community involvement. Thankfully, great women in the community are working to improve diversity. For example, Karen Jex is creating a brand-new Diversity Task Force at PostgreSQL Europe.
Personal Reflection:
First, it means that in a lot of cities around the world, I have friends ready to have a drink with me! The Postgres community is very helpful. People will take their personal time to help you solve your issues. It gives me faith in humanity.
How do you balance your professional and personal life, especially in a field that is constantly evolving?
People who know me know I’m passionate. If you’re passionate about your work, don’t try to separate it from your personal life. I try to have as much fun as I can, and that’s good enough for me! I often say that if you have time to get bored, it means you’re living wrong.
So, do enjoy life. Don’t try to be perfect. Take the time needed with your family and friends. Remember your children need you, but they also need you to be happy!
Message to the Community:
You have the right to be there. Take that empty seat!
Our idea explores the implementation of AI-driven query optimization in PostgreSQL, addressing the limitations of traditional optimization methods in handling modern database complexities. We present an innovative approach using reinforcement learning for automated index selection and query plan optimization. Our system leverages PostgreSQL’s pg_stat_statements for collecting query metrics and employs HypoPG for index simulation, while a neural network model learns optimal indexing strategies from historical query patterns. Through comprehensive testing on various workload scenarios, we validate the model’s ability to adapt to dynamic query patterns and complex analytical workloads. The research also examines the scalability challenges and practical considerations of implementing AI optimization in production environments.
Our findings establish a foundation for future developments in self-tuning databases while offering immediate practical benefits for PostgreSQL deployments. This work contributes to the broader evolution of database management systems, highlighting the potential of AI in creating more efficient and adaptive query optimization solutions.
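The abstract doesn’t include code, but the core reinforcement-learning idea can be sketched in miniature. The following is purely hypothetical and not from the actual system: a toy epsilon-greedy agent choosing among candidate indexes, where in a real setup the candidates would come from HypoPG simulations and the rewards from query costs observed via pg_stat_statements (all names and cost figures below are invented).

```python
import random

# Toy epsilon-greedy sketch of RL-based index selection.
# Hypothetical candidate indexes and their (unknown to the agent) mean costs:
candidates = ["idx_orders_customer_id", "idx_orders_created_at", "no_index"]
true_cost = {
    "idx_orders_customer_id": 12.0,  # lower cost is better
    "idx_orders_created_at": 40.0,
    "no_index": 95.0,
}

q = {c: 0.0 for c in candidates}  # estimated reward per candidate
n = {c: 0 for c in candidates}    # times each candidate was tried
epsilon = 0.1                     # exploration rate
random.seed(42)

for step in range(500):
    # Explore with probability epsilon, otherwise exploit the best estimate.
    if random.random() < epsilon:
        choice = random.choice(candidates)
    else:
        choice = max(q, key=q.get)
    # Reward = negative observed cost, with noise standing in for run-to-run variance.
    reward = -true_cost[choice] + random.gauss(0, 1)
    n[choice] += 1
    q[choice] += (reward - q[choice]) / n[choice]  # incremental mean update

print("learned best candidate:", max(q, key=q.get))
```

A production system would of course need far more: a reward signal robust to workload drift, the cost of actually building an index, and safeguards before acting on the agent’s choice.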
This talk provides an introductory overview of Artificial Intelligence (AI) and Machine Learning (ML), exploring key concepts and their application in building intelligent systems. It will highlight the essential AI/ML techniques, such as supervised and unsupervised learning, and discuss practical use cases in modern industries. The session also focuses on how PostgreSQL, with its powerful extensions like PostgresML, TimescaleDB, and PostGIS, supports the development of AI-powered applications. By leveraging PostgreSQL’s ability to handle complex datasets and integrate machine learning models, participants will learn how to build scalable, intelligent solutions directly within the database environment.
Success is a multiplier of Action, External Factors and Destiny.
Out of these three, the only controllable aspect is our action. Again, action is the result of our EQ, IQ, SQ, and WQ (Willingness Quotient) together.
We all want to be successful and keep trying to motivate ourselves with external factors. We read inspirational books, listen to great personalities, and whenever possible upgrade ourselves with more knowledge and the list goes on.
Indeed these are excellent motivators, but in this process, we forget the most important source of energy, YOU!
We read other stories to feel inspired, thinking “I am not enough!”
But the day we start accepting ourselves, introspecting, understanding, and aligning our life purpose with our routine, we find the internal POWER. This is a continuous source of motivation and energy, which we need in down moments. When we feel lonely or stuck and seek help, our inner voice is the greatest companion.
But, how many times do we consciously think about our “Subconscious”?
“Journey to Self” is our structured coaching program where we take the focus back from the outside and delve deep inside to find our inner strength, focusing on self-acceptance and personal growth.
I believe everyone has POWER within them!
Let’s be the POWERHOUSE!
Human, AI, and Personalized User Experience for DB Observability: A Composable Approach
Database users across various technical levels are frequently frustrated by the time-consuming and inefficient process of identifying the root causes of issues. This process often involves navigating multiple systems or dashboards, leading to delays in finding solutions and potential downstream impacts on operations.
The challenge is compounded by the varying levels of expertise among users. It is essential to strike the right balance between specialized and generalized experiences. Oversimplification can result in the loss of critical information, while an overwhelming amount of data can alienate certain users.
Developers and designers are constantly navigating these trade-offs to deliver optimal user experiences. The integration of AI introduces an additional layer of complexity. While AI can provide personalized experiences within databases, it is crucial to maintain user trust and transparency in the process.
The concept of personalized composable observability offers a potential solution. By combining the strengths of human expertise, information balance, and AI-driven personalization, we can create intuitive and user-friendly experiences. This approach allows users to tailor their observability tools and workflows to their specific needs and preferences.