allu62 Posted June 1, 2020 Hello. I'm trying to understand what is happening on my site. Last month, pages viewed and downloads were less than half of what they were in the months before, and this month awstats mostly reports fewer than 5 visitors a day, often 1 or even 0. I wondered if perhaps Tommy was often down or had other problems, or if HelioHost blocked me because the download volume was too big (> 1 GB in March) or because I did something else wrong. After all, 0 or 1 visitor per day is not normal. Then I thought that perhaps there is something wrong with the statistics applications. In fact, the Apache log file has the same size as in the months before, and the bandwidth application in cPanel reports 800 MB (versus 16 MB viewed + 140 MB not-viewed traffic in awstats). Can anyone help? Thanks.
wolstech Posted June 1, 2020 Did you delete the contents of your ~/tmp folder by chance? Doing that resets your stats. Also, I would verify that the stats apps are turned on (look under the Metrics Editor).
allu62 (Author) Posted June 2, 2020 No, I haven't deleted the ~/tmp folder content. Awstats, Analog Stats and Webalizer are all three turned on. Perhaps the number of visitors really has fallen this dramatically. What could be the reason? In Google Search everything looks normal, and I publish new programs and articles as before...
Krydos Posted June 3, 2020

"I wondered if perhaps Tommy was often down or had other problems."

Here's the monthly uptime for the last 10 months:

June 2020: 100%
May 2020: 99.53%
April 2020: 99.65%
March 2020: 99.77%
February 2020: 99.77%
January 2020: 99.77%
December 2019: 99.87%
November 2019: 99.94%
October 2019: 100%
September 2019: 99.48%

May 2020 was down a little due to a DDoS attack that lasted a couple of days.
allu62 (Author) Posted June 9, 2020 Searching the Internet, I found that:

- Bad website performance may be related to bad crawlers. Are there crawlers that you should always block, in robots.txt or in .htaccess? (There actually is a lot of access by certain robots; see the sketch below.)
- Having too many pages indexed would be bad for performance. With nearly 4000 animals in my database, and thus 4000 dynamically generated Perl pages, should I use "noindex" to keep all these pages out of the index?
- Has anyone experienced something similar, I mean a site with growing visitor numbers and then suddenly just a handful of visitors per day?

Thanks for help...
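For illustration, here is a minimal sketch of both ideas. The bot name is a placeholder, not a recommendation to block any particular crawler; the User-Agent strings should come from whatever the awstats robots report actually shows.

# robots.txt: ask a well-behaved crawler to stay away
# (ExampleBadBot is a made-up name)
User-agent: ExampleBadBot
Disallow: /

# .htaccess: refuse the same bot even if it ignores robots.txt
# (assumes Apache with mod_rewrite available)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ExampleBadBot [NC]
RewriteRule .* - [F,L]

And a hypothetical Perl CGI script, roughly the way one of the generated animal pages could send a "noindex" directive, both as an HTTP header and as a meta tag (either one alone is enough for search engines):

#!/usr/bin/perl
# Hypothetical sketch of a dynamically generated animal page that asks
# search engines not to index it while still letting them follow links.
use strict;
use warnings;

print "X-Robots-Tag: noindex, follow\n";
print "Content-Type: text/html; charset=UTF-8\n\n";

print <<'HTML';
<html>
<head>
  <meta name="robots" content="noindex, follow">
  <title>Animal page</title>
</head>
<body>...dynamically generated animal content...</body>
</html>
HTML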