What is the best way to scale up a database?

Ghost

Platinum Coder
It is no secret that scaling a database to handle millions of rows of data and support hundreds of thousands of connections can be extremely tricky. However, caching, using multiple database servers, optimizing queries, and storing data properly can eliminate many of the issues that lead to a slow, unstable, or crashing database.

Even well-known software like XenForo or WordPress can behave horribly once a site reaches extremely high traffic and data volumes.

What do you do to make sure your site and database keep working well long-term?
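Since caching comes up first in that list, here is a minimal read-through cache sketch. It uses Python's built-in sqlite3 and an in-process dict purely so it runs as-is; the names and the TTL are assumptions, and in a real deployment the same pattern would sit in front of your MySQL/Postgres driver with a shared cache like Redis or memcached instead of a dict.

import sqlite3
import time

_cache = {}          # key -> (expires_at, rows)
CACHE_TTL = 60       # seconds a cached result stays fresh

def cached_query(conn, sql, params=()):
    """Return cached rows if still fresh, otherwise hit the database and cache the result."""
    key = (sql, params)
    hit = _cache.get(key)
    if hit and hit[0] > time.time():
        return hit[1]                      # cache hit: no database round trip
    rows = conn.execute(sql, params).fetchall()
    _cache[key] = (time.time() + CACHE_TTL, rows)
    return rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO posts (title) VALUES ('Scaling databases')")
print(cached_query(conn, "SELECT id, title FROM posts WHERE id = ?", (1,)))   # hits the DB
print(cached_query(conn, "SELECT id, title FROM posts WHERE id = ?", (1,)))   # served from cache

Hot read paths (thread lists, front pages, and the like) are usually where this pays off, since they are read far more often than they change.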
 
IMO a relatively easy tweak that gives you a lot of performance is storing your data in a Galera Cluster.

It's especially interesting because it supports multi-master replication, so your forum/website/etc. doesn't need to be aware of the cluster and can treat it like a single DB on a single server.
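To make the "treat it like a single DB" point concrete, here is a rough sketch of the application side: a normal MySQL driver pointed at one endpoint, with the Galera nodes hidden behind a virtual IP or an HAProxy/ProxySQL front-end. The hostnames, credentials, and the proxy itself are placeholders for illustration, not details from this thread.

# Requires mysql-connector-python; any MySQL-compatible driver looks the same.
import mysql.connector

conn = mysql.connector.connect(
    host="db-cluster.internal",   # VIP or proxy sitting in front of the Galera nodes
    port=3306,
    user="forum",                 # placeholder credentials
    password="secret",
    database="forum",
)

# Writes land on whichever node the proxy picks; Galera replicates them
# synchronously to the other masters, so the application code never changes.
cur = conn.cursor()
cur.execute("SELECT COUNT(*) FROM posts")
print(cur.fetchone()[0])

The nice part is that adding a node or taking one down for maintenance happens behind that single endpoint, invisible to the forum software.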
 
I can't recall if that was our exact solution for one of our clients, but our CEO (who is very handy with databases and servers) set up a pretty unique solution for a client's database. We were actively crawling business data online to gather business names, addresses, phone numbers, and a lot more, so the database quickly filled up. We ended up stopping at around 15 million total rows of business data.

Because his website had a country-wide mapping system connecting visitors and his employees with business locations, we had to split the data up a bit. We started by breaking all of the data into per-US-state tables, so if a user was browsing Kentucky on the interactive map, we loaded data from the state_kentucky_info table.

This was fine, but as he got more visitors and more data to add to the businesses, we had to go a step further. Our CEO created multiple database servers that were all replicated automatically, so we could update and insert data as usual, with a smart server in front that knew when to offload some of the work. It was pretty nifty and reduced the stress we had when there was only a single standard database.
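A rough sketch of the per-state routing described above, assuming the state_kentucky_info naming pattern from the post; the columns, the whitelist, and the function names are guesses for illustration, not the client's actual schema, and it uses sqlite3 only so the example runs as-is.

import re
import sqlite3

US_STATES = {"kentucky", "ohio", "indiana"}   # would list all 50 states in practice

def state_table(state: str) -> str:
    """Map a state name to its dedicated table, e.g. 'Kentucky' -> 'state_kentucky_info'."""
    name = state.strip().lower().replace(" ", "_")
    if name not in US_STATES or not re.fullmatch(r"[a-z_]+", name):
        raise ValueError(f"unknown state: {state}")
    return f"state_{name}_info"

def businesses_in(conn, state, limit=50):
    # Only the table name is interpolated, and only from the whitelist above;
    # user-supplied values still go through bound parameters.
    sql = f"SELECT name, address, phone FROM {state_table(state)} LIMIT ?"
    return conn.execute(sql, (limit,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE state_kentucky_info (name TEXT, address TEXT, phone TEXT)")
conn.execute("INSERT INTO state_kentucky_info VALUES ('Joe''s Diner', '1 Main St', '555-0100')")
print(businesses_in(conn, "Kentucky"))

Splitting by state keeps each table and its indexes small, and it maps cleanly onto the map UI, since a page only ever needs one state's data at a time.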
 