
Instruction How to find the busiest website by reading access log file

Here is an update to the code.

What I did was add human-readable sizes so it is easier to see how much traffic a website uses, and sort the list so the busiest sites come last and stand out at a glance.
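For example, the bc conversion used below turns a raw byte count into a rounded figure with two decimal places (the 2.5 GB byte count here is just made-up sample input):

echo "scale=2; 2684354560/1073741824" | bc    # prints 2.50, shown as 2.50G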

#!/bin/bash

# Clear any existing queue file
> rxqueue.txt

# Find all access_*_log files under /var/www/vhosts/system and write them to the queue file
find /var/www/vhosts/system -name 'access_*_log' > rxqueue.txt

# Temporary file to store sizes and paths
> size_output.txt

# Process each entry in the queue file
while IFS= read -r r; do
    # Get the size of each file in bytes using the 'stat' command
    size=$(stat -c '%s' "$r")

    # Convert the size to an appropriate unit
    if [ "$size" -ge 1073741824 ]; then
        # Size is 1 GB or more
        size_human=$(echo "scale=2; $size/1073741824" | bc)G
    elif [ "$size" -ge 1048576 ]; then
        # Size is 1 MB or more
        size_human=$(echo "scale=2; $size/1048576" | bc)M
    elif [ "$size" -ge 1024 ]; then
        # Size is 1 KB or more
        size_human=$(echo "scale=2; $size/1024" | bc)K
    else
        # Size is in bytes
        size_human="${size}B"
    fi

    # Write the raw size, human-readable size and file path to the temporary file
    echo "$size $size_human $r" >> size_output.txt
done < rxqueue.txt

# Sort by size (numeric, ascending) and print with human-readable units and paths
sort -n size_output.txt | awk '{print $2, $3}'

# Clean up temporary files
rm rxqueue.txt size_output.txt

# End of the script
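
To try it, save the script to a file and run it as root so it can read the log directories under /var/www/vhosts/system (the file name busiest_sites.sh below is just an example):

# Example invocation (the script name is only an example)
chmod +x busiest_sites.sh
sudo ./busiest_sites.sh

Because the entries are sorted ascending by raw byte count, the busiest website's access log is the last line of the output.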