
[Solved] Suspended: amanar17


amanar17


 

His MySQL database is 3.52 GB.

 

The cPanel disk usage tool reported ~543.03 MB for MySQL when I checked his account, hence my response. Where is the other 3 GB that I'm missing?

 

 

 

root@johnny [/var/lib/mysql/amanar17_tle]# du -sh
3.4G    .
Maybe he should try exporting the database, dropping it, recreating it, and importing the data back in.

 

 

That is surely not possible; I never had that much data in there. I created half a gigabyte's worth of tables and then ran a script to define some views. That's when the account got suspended.

 

From what I know of views, they do not take up 3 GB on their own.


We're not sure why this is so large either... indexes and overhead, possibly?
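
Something along these lines, run on the server, would break the usage down per table and show how much of it is data versus indexes versus free/overhead space (sizes in MB):

# rough sketch: per-table data vs. index vs. free space for the database
mysql -e "SELECT table_name,
                 ROUND(data_length/1048576, 1)  AS data_mb,
                 ROUND(index_length/1048576, 1) AS index_mb,
                 ROUND(data_free/1048576, 1)    AS free_mb
          FROM information_schema.tables
          WHERE table_schema = 'amanar17_tle'
          ORDER BY data_length + index_length DESC
          LIMIT 20;"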

 

Try the suggestion given: export a backup of the database, drop it, create a new one, and reimport the backup. Then we'll check whether the size goes down.
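
From the shell the cycle would look roughly like this (credentials and dump options left out):

# export, drop, recreate, reimport - a rough sketch, not the exact commands
mysqldump amanar17_tle > amanar17_tle.sql
mysql -e "DROP DATABASE amanar17_tle; CREATE DATABASE amanar17_tle;"
mysql amanar17_tle < amanar17_tle.sql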


Thanks very much! There is one small problem, however: I am still locked out of phpMyAdmin, and remote access is still blocked.

 

I deleted the database again and recreated it, and I still do not have access. By the way, I already had all the data backed up on my hard disk, so once I have access I can simply recreate it.

 

EDIT: I get the message

mysqli_connect(): (HY000/3118): Access denied for user 'amanar17'@'localhost'. Account is locked.

when trying to log in to phpMyAdmin.


I'm glad you had all the data backed up.

 

I tried to back up your database for you several times, and it crashed MySQL every time. I finally just dropped the database, and even that took 20 minutes. Whatever you had in there, Johnny can't handle it.

 

Since I finally got your database deleted, Johnny's MySQL has been running perfectly smoothly. I looked at the uptime for the last week and determined that you caused about 41 hours of downtime. Johnny never has the greatest uptime, but that's not acceptable even for Johnny. You'll need to find somewhere else to host your 3.5 GB database.


Fine, but I really do not think the problem is on my end. What I had in there were TLE datasets (https://www.celestrak.com/NORAD/elements/) converted to CSV files. They are public, time-indexed records of orbital position measurements in a compact format. There was around half a gigabyte of that data, stored in a few thousand tables, one table per trackable object.
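
Each per-object table was small and simple - roughly along these lines, although the table and column names here are only illustrative, not my exact schema:

mysql amanar17_tle <<'SQL'
-- one small table per trackable object (illustrative names, not the real schema)
CREATE TABLE obj_25544 (
    epoch DATETIME NOT NULL,   -- time the element set was published
    line1 CHAR(69) NOT NULL,   -- first line of the TLE record
    line2 CHAR(69) NOT NULL,   -- second line of the TLE record
    PRIMARY KEY (epoch)
);
SQL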

 

Then I cycled through all the days of the last few years, and for each day I defined a view that extracted that day's data from the whole dataset and queried the view as a test. This is where something went wrong. Yes, that is a lot of views - one per day for several years, each one searching every table in the database. Still, it should not have caused this much downtime, and it certainly should not have created 3 GB of extra data: even if you treat the views as temporary tables, their total size should be at most the size of the database (and in reality far less, because my script did not run to completion).
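
The script generated each per-day view roughly like this, with one UNION ALL branch for every object table (again, the names are only illustrative):

mysql amanar17_tle <<'SQL'
-- one view per day, each one scanning every per-object table (illustrative)
CREATE VIEW tle_2019_06_01 AS
    SELECT 25544 AS norad_id, epoch, line1, line2
      FROM obj_25544 WHERE DATE(epoch) = '2019-06-01'
    UNION ALL
    SELECT 20580 AS norad_id, epoch, line1, line2
      FROM obj_20580 WHERE DATE(epoch) = '2019-06-01';
    -- ...and so on, one UNION ALL branch per object table
SQL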


This topic is now closed to further replies.