News

Wrangling Database Performance with Quest Foglight

Written by Eric Fouarge | Feb 19, 2025 7:24:00 PM

I’m always looking for tools that help my teams optimize performance, control costs, and keep systems running smoothly. One tool that has made a real difference for us is Quest Foglight, which filled a major gap in our database observability tooling.

For a long time, we struggled to find a single solution that could give us clear visibility into what was actually happening across our database servers—whether that was SQL Server running on VMs, Amazon RDS, or PostgreSQL data warehouses. Foglight has resolved much of that challenge for us.

It’s not perfect, but for pure RDBMS performance management, it’s been far more effective than tools like Datadog or New Relic, which tend to shine more at application-level observability than deep database tuning.

👉 You can read my full review on PeerSpot.

What Foglight Has Helped Our Team With

✅ All-in-One Database Monitoring

Foglight supports SQL Server, MySQL, PostgreSQL, and more, providing a centralized view of database health across environments. This aligns closely with how we help clients at Ontrac Solutions manage complex, multi-database ecosystems.

🔗 https://www.quest.com/products/foglight/

✅ Smart Alerts & AI-Driven Insights

Foglight identifies performance issues before they escalate, helping teams reduce downtime and firefighting. This proactive visibility is critical when supporting mission-critical production systems.

✅ Deep Performance Analytics

Foglight pinpoints slow queries, resource bottlenecks, and workload trends—exactly the level of insight needed to drive real optimization rather than guesswork.

This is a core part of how we approach database performance optimization at Ontrac.

✅ Works Across Any Environment

Whether databases are on-prem, cloud, or hybrid, Foglight adapts. That flexibility matters as more organizations run mixed infrastructures rather than fully cloud-native stacks.

✅ Cost & Resource Optimization

Foglight helps teams right-size resources and eliminate waste, keeping cloud costs in check. Quest’s flexible licensing model also makes it easier to align tooling with actual operational needs instead of overpaying upfront.

Managed Cloud Databases Are Not a Magic Performance Bullet

Managed database services from cloud providers make deployment easier, but they often fall short when it comes to true performance tuning.

Cloud platforms typically offer basic monitoring, but not the granular visibility needed to tune queries, optimize workloads, or prevent resource waste. The result is over-provisioned compute and storage, which quietly drives cloud costs higher.

“Cloud databases are great for scalability, but without tuning, they become an expensive black box.”
Pat Helland, Distributed Systems Architect

Services like Amazon RDS and Amazon Aurora abstract much of the underlying infrastructure. That abstraction limits visibility into execution plans, indexing strategies, and resource allocation.

When performance issues arise, the default recommendation is often to scale up—an expensive and unsustainable approach. This is where third-party tools like Foglight help bridge the gap by providing the actionable insights that managed services alone do not.

The Real Impact of Database Performance Tuning

When organizations invest in database performance optimization, they often see 30–50% cost savings on compute and storage.

At Ontrac, we see this consistently across SQL Server and PostgreSQL workloads.

💰 SQL Server on EC2 – Cutting Licensing Costs

SQL Server is licensed per vCPU, so tuning queries and improving efficiency can allow teams to scale down instance sizes and reduce licensing fees.

  • Many organizations reduce SQL Server licensing costs by 25–40%
  • Optimized indexing reduces EBS IOPS costs
  • Better query plans lower overall CPU utilization
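Because licensing scales linearly with vCPU count, the savings math is simple to sketch. The per-vCPU license price and EC2 rate below are illustrative placeholders, not quoted AWS or Microsoft prices:

```python
# Rough sketch of the licensing math: SQL Server on EC2 is licensed
# per vCPU, so scaling an instance down cuts the license bill too.
# Both rates below are illustrative placeholders, not real quotes.

def annual_sql_server_cost(vcpus: int,
                           license_per_vcpu_year: float = 3_700.0,
                           ec2_per_vcpu_hour: float = 0.05) -> float:
    """Estimated yearly cost: per-vCPU licensing plus EC2 compute."""
    compute = vcpus * ec2_per_vcpu_hour * 24 * 365
    licensing = vcpus * license_per_vcpu_year
    return compute + licensing

before = annual_sql_server_cost(16)  # 16-vCPU instance before tuning
after = annual_sql_server_cost(8)    # scaled down after query tuning
savings_pct = (before - after) / before * 100
print(f"Estimated savings: {savings_pct:.0f}%")
```

Since both compute and licensing scale with vCPUs, halving the instance size halves the combined bill, which is why tuning that unlocks a downsize pays off twice.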

🔗 https://learn.microsoft.com/en-us/sql/sql-server/

💰 PostgreSQL on AWS Aurora – Reducing Compute & Storage Costs

Aurora pricing is tied to instance size and IOPS, which means inefficiencies directly inflate spend.

Companies that tune Aurora workloads often achieve:

  • 30–50% lower compute costs
  • 20–30% lower storage costs
  • Reduced cross-AZ replication expenses
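A minimal sketch of why that compute saving follows: Aurora Serverless v2, for example, bills per ACU-hour, so a workload that idles at a higher capacity floor pays for it around the clock. The ACU-hour rate and the before/after capacities below are assumptions for illustration, not quoted prices:

```python
# Illustrative Aurora Serverless v2 compute estimate.
# The $0.12/ACU-hour rate is a placeholder, not a quoted AWS price.

ACU_HOUR_PRICE = 0.12

def monthly_compute(avg_acus: float) -> float:
    """Approximate monthly compute cost for an average ACU load."""
    return avg_acus * ACU_HOUR_PRICE * 24 * 30

before = monthly_compute(8.0)  # untuned workload averaging 8 ACUs
after = monthly_compute(4.5)   # same workload after index/query tuning
print(f"${before:.0f} -> ${after:.0f} per month")
```

In this hypothetical, trimming the average load from 8 to 4.5 ACUs cuts compute spend by roughly 44%, squarely in the 30–50% range above.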

🔗 https://aws.amazon.com/rds/aurora/

3 Things You Should Do Right Now to Validate Database Cost Efficiency

These are practices we regularly bring into our database performance management workflows to stay fiscally responsible while keeping systems running smoothly.

“Optimizing a single slow query can save thousands in cloud costs—because performance isn’t just about speed, it’s about cost control.”
Baron Schwartz, Author of High Performance MySQL

1️⃣ Analyze Query Performance & Indexing

Identify slow queries driving CPU usage and optimize indexes to reduce full-table scans. Tools like Foglight help surface these issues early.
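The key to this step is ranking queries by *total* time consumed, not mean latency, which is the same view pg_stat_statements in PostgreSQL or Foglight's top-SQL dashboards provide. A minimal sketch with made-up sample rows:

```python
# Rank queries by total elapsed time, the number that actually
# drives CPU usage and cost. The rows below are fabricated samples
# standing in for pg_stat_statements or DMV output.

queries = [
    {"query": "SELECT * FROM orders WHERE status = %s",
     "calls": 120_000, "mean_ms": 45.0},
    {"query": "SELECT id FROM users WHERE email = %s",
     "calls": 900_000, "mean_ms": 0.4},
    {"query": "SELECT count(*) FROM events GROUP BY day",
     "calls": 50, "mean_ms": 8_200.0},
]

# A cheap query called a million times can out-cost one slow report,
# so compute calls x mean time for each statement.
for row in queries:
    row["total_s"] = row["calls"] * row["mean_ms"] / 1000

worst = max(queries, key=lambda r: r["total_s"])
print(worst["query"], f"({worst['total_s']:.0f}s total)")
```

Here the moderately slow but very frequent `orders` scan dominates (5,400 seconds total), even though the reporting query is 180x slower per call.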

2️⃣ Right-Size Database Instances & Storage

Most organizations over-provision “just in case.” Review utilization regularly and scale down where possible—especially with SQL Server on EC2 and Aurora ACUs.
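One simple way to make this review concrete is to check peak (e.g. 95th-percentile) utilization rather than averages, and flag instances that never come close to their capacity. The 40% threshold and the sample values below are illustrative assumptions:

```python
# Right-sizing sketch: if p95 CPU utilization stays under ~40%,
# the instance is a candidate for scaling down. Threshold and
# samples are illustrative, not a universal rule.

def p95(samples: list[float]) -> float:
    """95th-percentile by nearest-rank on sorted samples."""
    ordered = sorted(samples)
    idx = min(len(ordered) - 1, int(0.95 * len(ordered)))
    return ordered[idx]

# Hourly CPU% samples, e.g. exported from CloudWatch or Foglight.
cpu_samples = [12, 15, 9, 22, 18, 11, 30, 25, 14, 10, 35, 16]

if p95(cpu_samples) < 40:
    print("candidate for downsizing")
```

Using p95 instead of the mean protects against shrinking an instance that looks idle on average but spikes during batch windows.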

3️⃣ Monitor IOPS & Replication Costs

Poorly optimized queries drive high IOPS. Fixing them reduces costs quickly. Also review cross-AZ replication and backup retention policies, which can quietly inflate spend.
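A back-of-the-envelope sketch of the IOPS point: on I/O-billed services such as Aurora Standard, a scan-heavy query pays for every page it reads, so an index that cuts reads cuts the bill directly. The per-million-request price and request rates below are placeholders, not quoted prices:

```python
# Monthly I/O cost estimate for a request-billed database service.
# The $0.20 per million I/O requests rate is a placeholder.

IO_PRICE_PER_MILLION = 0.20

def monthly_io_cost(requests_per_second: float) -> float:
    """Approximate monthly cost of a sustained I/O request rate."""
    monthly_requests = requests_per_second * 3600 * 24 * 30
    return monthly_requests / 1_000_000 * IO_PRICE_PER_MILLION

# A scan-heavy query at 2,000 I/O req/s vs the same query after an
# index brings it down to 100 req/s:
before = monthly_io_cost(2_000)
after = monthly_io_cost(100)
print(f"${before:.0f}/mo -> ${after:.0f}/mo")
```

In this hypothetical, one index turns roughly $1,037/month of I/O into about $52/month, which is why fixing high-IOPS queries tends to pay back faster than any infrastructure change.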

Final Thoughts

If you’re looking at your database fleet and feel frustrated by limited visibility, you’re not alone. Managed services simplify operations—but they don’t replace performance tuning.

Tools like Quest Foglight, combined with a disciplined optimization approach, can dramatically improve visibility, performance, and cost control.

If you want help assessing your database environment or optimizing cloud database spend, connect with Ontrac—this is exactly the type of work we help teams tackle every day.

Hope this helps someone who’s unhappy with where their database visibility stands today.