
Question [Passenger] Log level not working? Log filled with debug statistics

Dutiko

New Pleskian
Server operating system version
Debian 11.5
Plesk version and microupdate number
18.0.47 u2
Hello,

I'm having an issue with passenger.log. It's filled every 5 seconds with this kind of log entry, even when no website is using it:

Code:
[ D 2022-10-13 12:07:40.0001 3592350/T1g age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0002 3592350/T1g Ser/Server.h:691 ]: [ServerThr.23] Updating statistics
[ D 2022-10-13 12:07:40.0002 3592350/T3 age/Cor/App/Poo/AnalyticsCollection.cpp:116 ]: Analytics collection time...
[ D 2022-10-13 12:07:40.0003 3592350/T3 age/Cor/App/Poo/AnalyticsCollection.cpp:143 ]: Collecting process metrics
[ D 2022-10-13 12:07:40.0003 3592350/T1s age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0003 3592350/T3 age/Cor/App/Poo/AnalyticsCollection.cpp:151 ]: Collecting system metrics
[ D 2022-10-13 12:07:40.0003 3592350/T1c age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0003 3592350/T1r age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0004 3592350/T1s Ser/Server.h:691 ]: [ServerThr.29] Updating statistics
[ D 2022-10-13 12:07:40.0004 3592350/T1c Ser/Server.h:691 ]: [ServerThr.21] Updating statistics
[ D 2022-10-13 12:07:40.0005 3592350/T1r Ser/Server.h:691 ]: [ServerThr.28] Updating statistics
[ D 2022-10-13 12:07:40.0011 3592350/T3 age/Cor/App/Poo/AnalyticsCollection.cpp:67 ]: Analytics collection done; next analytics collection in 4.999 sec
[ D 2022-10-13 12:07:40.0040 3592350/Ti age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0041 3592350/T11 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0041 3592350/T2c age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0041 3592350/T22 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0041 3592350/Ti Ser/Server.h:691 ]: [ServerThr.6] Updating statistics
[ D 2022-10-13 12:07:40.0041 3592341/T5 Ser/Server.h:691 ]: [WatchdogApiServer] Updating statistics
[ D 2022-10-13 12:07:40.0043 3592350/Tu age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0041 3592350/T28 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0042 3592350/T18 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0041 3592350/T11 Ser/Server.h:691 ]: [ServerThr.15] Updating statistics
[ D 2022-10-13 12:07:40.0041 3592350/Tg age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0042 3592350/T1k age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0042 3592350/T2c Ser/Server.h:691 ]: [ServerThr.39] Updating statistics
[ D 2022-10-13 12:07:40.0042 3592350/T12 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0042 3592350/T22 Ser/Server.h:691 ]: [ServerThr.34] Updating statistics
[ D 2022-10-13 12:07:40.0043 3592350/T26 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T1m age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/Tw age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T2a age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T24 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/Tk age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/Te age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T1w age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T16 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T2e age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/Tt age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T1y age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T1a age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0044 3592350/Tu Ser/Server.h:691 ]: [ServerThr.12] Updating statistics
[ D 2022-10-13 12:07:40.0043 3592350/T1e age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T2g Ser/Server.h:691 ]: [ApiServer] Updating statistics
[ D 2022-10-13 12:07:40.0043 3592350/Tz age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/Tr age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0044 3592350/T28 Ser/Server.h:691 ]: [ServerThr.37] Updating statistics
[ D 2022-10-13 12:07:40.0043 3592350/T1i age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T14 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T1u age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/Tm age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0043 3592350/T21 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0044 3592350/T18 Ser/Server.h:691 ]: [ServerThr.19] Updating statistics
[ D 2022-10-13 12:07:40.0045 3592350/Tg Ser/Server.h:691 ]: [ServerThr.5] Updating statistics
[ D 2022-10-13 12:07:40.0045 3592350/T1k Ser/Server.h:691 ]: [ServerThr.25] Updating statistics
[ D 2022-10-13 12:07:40.0045 3592350/T12 Ser/Server.h:691 ]: [ServerThr.16] Updating statistics
[ D 2022-10-13 12:07:40.0046 3592350/T1o age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0046 3592350/T26 Ser/Server.h:691 ]: [ServerThr.36] Updating statistics
[ D 2022-10-13 12:07:40.0046 3592350/T1m Ser/Server.h:691 ]: [ServerThr.26] Updating statistics
[ D 2022-10-13 12:07:40.0046 3592350/Tw Ser/Server.h:691 ]: [ServerThr.13] Updating statistics
[ D 2022-10-13 12:07:40.0047 3592350/T2a Ser/Server.h:691 ]: [ServerThr.38] Updating statistics
[ D 2022-10-13 12:07:40.0047 3592350/Tp age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0047 3592350/T24 Ser/Server.h:691 ]: [ServerThr.35] Updating statistics
[ D 2022-10-13 12:07:40.0047 3592350/Tk Ser/Server.h:691 ]: [ServerThr.7] Updating statistics
[ D 2022-10-13 12:07:40.0053 3592350/Tp Ser/Server.h:691 ]: [ServerThr.9] Updating statistics
[ D 2022-10-13 12:07:40.0047 3592350/Te Ser/Server.h:691 ]: [ServerThr.4] Updating statistics
[ D 2022-10-13 12:07:40.0048 3592350/T1w Ser/Server.h:691 ]: [ServerThr.31] Updating statistics
[ D 2022-10-13 12:07:40.0048 3592350/T16 Ser/Server.h:691 ]: [ServerThr.18] Updating statistics
[ D 2022-10-13 12:07:40.0048 3592350/T2e Ser/Server.h:691 ]: [ServerThr.40] Updating statistics
[ D 2022-10-13 12:07:40.0049 3592350/Tt Ser/Server.h:691 ]: [ServerThr.11] Updating statistics
[ D 2022-10-13 12:07:40.0049 3592350/Ta age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0049 3592350/Td age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0049 3592350/T1y Ser/Server.h:691 ]: [ServerThr.32] Updating statistics
[ D 2022-10-13 12:07:40.0049 3592350/T1a Ser/Server.h:691 ]: [ServerThr.20] Updating statistics
[ D 2022-10-13 12:07:40.0050 3592350/T1e Ser/Server.h:691 ]: [ServerThr.22] Updating statistics
[ D 2022-10-13 12:07:40.0050 3592350/T8 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2022-10-13 12:07:40.0050 3592350/Tz Ser/Server.h:691 ]: [ServerThr.14] Updating statistics
[ D 2022-10-13 12:07:40.0051 3592350/Tr Ser/Server.h:691 ]: [ServerThr.10] Updating statistics
[ D 2022-10-13 12:07:40.0051 3592350/T1i Ser/Server.h:691 ]: [ServerThr.24] Updating statistics
[ D 2022-10-13 12:07:40.0051 3592350/T14 Ser/Server.h:691 ]: [ServerThr.17] Updating statistics
[ D 2022-10-13 12:07:40.0052 3592350/T1u Ser/Server.h:691 ]: [ServerThr.30] Updating statistics
[ D 2022-10-13 12:07:40.0052 3592350/Tm Ser/Server.h:691 ]: [ServerThr.8] Updating statistics
[ D 2022-10-13 12:07:40.0052 3592350/T21 Ser/Server.h:691 ]: [ServerThr.33] Updating statistics
[ D 2022-10-13 12:07:40.0053 3592350/T1o Ser/Server.h:691 ]: [ServerThr.27] Updating statistics
[ D 2022-10-13 12:07:40.0055 3592350/Ta Ser/Server.h:691 ]: [ServerThr.2] Updating statistics
[ D 2022-10-13 12:07:40.0055 3592350/Td Ser/Server.h:691 ]: [ServerThr.3] Updating statistics
[ D 2022-10-13 12:07:40.0055 3592350/T8 Ser/Server.h:691 ]: [ServerThr.1] Updating statistics

I've tried changing PassengerLogLevel in /etc/apache2/mods-enabled/passenger.conf (which defaults to 5, debug) to the warning level (2), but there's no improvement; I'm still receiving tons of these logs...
Any idea how to deal with this issue?
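For reference, the change I made is basically just this one directive (a sketch of only the relevant line; the rest of the shipped file was left untouched):

Code:
# /etc/apache2/mods-enabled/passenger.conf
# Passenger scale: 2 = warning, 5 = debug (the file shipped with 5 here)
PassengerLogLevel 2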
 
I am not using Plesk, but I am having the same issue with Redmine 4.1.1.stable.

I also tried adjusting the LogLevel setting (see the attached screenshot) and confirmed that I restarted apache2, but there was no change. Does anyone have any other ideas? Is there a tool out there that will analyze Passenger settings and give you information and suggestions?

[Attachment: Screenshot 2023-06-22 030502.jpg]

Thanks!
 
Followup: I ran the command "passenger-status -v --show=xml" and posted the output it produced to Hastebin.

Notice line 47 of that file: it still says that it's using Log Level 5. I'm not sure if there is another way to set/change that setting other than using the "PassengerLogLevel" directive in the apache2 config file... but I'll keep looking!
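As a quick way to re-check the active level after each attempt, the same command can be filtered (assuming the field is named log_level in the XML, as it appeared to be in my dump):

Code:
# if this matches nothing, the field name may differ between Passenger versions; try grep -i log
passenger-status -v --show=xml | grep -i log_level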
 
Well, I figured out my issue anyway... lol. It turns out that the PassengerLogLevel directive didn't have any effect in the passenger.conf file. However, when I put the same directive in the apache.conf file, it worked. The output of passenger-status -v --show=xml now reflects the log level as 1.
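For anyone else landing here, the working setup looked roughly like this on my side (a sketch only; I'm assuming a Debian/Ubuntu-style layout where the global Apache config is /etc/apache2/apache2.conf, so adjust paths for your distribution):

Code:
# /etc/apache2/apache2.conf -- global server context, not the module's passenger.conf
PassengerLogLevel 1

# then restart Apache and verify:
systemctl restart apache2
passenger-status -v --show=xml | grep -i log_level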

Moderator -- please feel free to delete all three of these messages if you wish.

Thanks!
 
@Dutiko Did you get this to work? I've updated
/etc/apache2/mods-enabled/passenger.conf
...then restarted Apache as suggested by @Maarten.

However, I am still seeing a constant flow of log entries that I don't really need to see, and it's becoming a space issue (my passenger.log was 5 GB+).

Code:
[ D 2023-06-23 11:08:30.0044 4055/T14 Ser/Server.h:691 ]: [ApiServer] Updating statistics
[ D 2023-06-23 11:08:30.0044 4055/Ts Ser/Server.h:691 ]: [ServerThr.11] Updating statistics
[ D 2023-06-23 11:08:30.0044 4055/Tl Ser/Server.h:691 ]: [ServerThr.7] Updating statistics
[ D 2023-06-23 11:08:30.0044 4055/Ty age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2023-06-23 11:08:30.0044 4055/Ty Ser/Server.h:691 ]: [ServerThr.14] Updating statistics
[ D 2023-06-23 11:08:30.0045 4055/T8 age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2023-06-23 11:08:30.0045 4055/Tw age/Cor/Con/TurboCaching.h:246 ]: Clearing turbocache
[ D 2023-06-23 11:08:30.0045 4055/T8 Ser/Server.h:691 ]: [ServerThr.1] Updating statistics
[ D 2023-06-23 11:08:30.0045 4055/Tw Ser/Server.h:691 ]: [ServerThr.13] Updating statistics

Does anyone have more tips, or can someone confirm how to stop this excessive logging?

Ubuntu 18.04.6 LTS / Plesk Obsidian / Version 18.0.51 Update #1
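Side note on the disk-space angle: independent of the log level, a logrotate rule keeps the file bounded. A minimal sketch, assuming the log sits at /var/log/passenger.log (adjust the path to wherever your passenger.log actually lives), dropped into /etc/logrotate.d/passenger:

Code:
/var/log/passenger.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
    copytruncate
}

copytruncate is used here so Passenger does not need to be signalled to reopen its log file.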
 
@Organizer
Are you using Apache or Nginx? For Nginx, there is a config file in /etc/nginx/conf.d where you can set the log level:
Code:
# grep log_level /etc/nginx/conf.d/phusion-passenger.conf
passenger_log_level 0;
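After changing that value, nginx presumably has to be reloaded for it to take effect, for example:

Code:
nginx -t && systemctl reload nginx

Note that on a Plesk server the configuration in /etc/nginx/conf.d may be regenerated by Plesk itself, so it's worth checking that the edit survives updates.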
 
Thank you @Maarten. Ah, I am of course using Nginx; I should have thought of that. I'm pretty sure it will work now.
 
You could post a Uservoice request to set it to level 2 by default. I am sure they have good reasons for the current default, but I am not sure what the response would be or how much it would help. Maybe it is best to simply request that the default be changed to level 2.
 
I just found this Uservoice request:

 
I think the developers forgot to set the log level to a reasonable value. The current default is insane.
 