Monitoring Access to the Squid Server

Online monitoring from the command line

To monitor updates in real time, we can rely on the "-f" parameter of the "tail" command, applying filters according to our interest.

# tail -f /var/log/squid/access.log | awk '{print $3 " " $8 " " $7}'
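The same filter can be tried offline against a single sample line in Squid's native log format (the line below is fabricated for illustration): field 3 is the client address, field 8 the authenticated user, and field 7 the requested URL.

```shell
# Feed one made-up Squid-format log line through the same awk filter.
printf '%s\n' \
  '1066036250.711 1858 192.168.0.10 TCP_MISS/200 1024 GET http://example.com/ wrochal DIRECT/93.184.216.34 text/html' |
awk '{print $3 " " $8 " " $7}'
# prints: 192.168.0.10 wrochal http://example.com/
```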

Monitoring with SARG

With SARG it is possible to track user access in more detail. SARG generates reports and can be configured in several languages through the configuration file /etc/sarg/sarg.conf; its configuration is simple and easy.

You can learn a little more about SARG and its creator in the More Information section.

Installing SARG

SARG can be obtained from the project's site. After downloading, unpack it with the command:
# tar -xzvf sarg-1.3-PRE2.tar.gz

After that, in the directory where the program was unpacked, type:
# ./configure
# make
# make install

By default, SARG is installed in the directory /usr/local/sarg. In /etc/sarg/ we find the configuration file sarg.conf.

Configuring SARG

I will cover the main parameters; the file itself is well commented.

Defining the language

language Portuguese

Report title

title "Squid User Access Reports"

Directory where the reports will be generated

output_dir /home/squid/report/

To generate reports based on user names (requires a proxy configured with user authentication):

user_ip no
This option specifies the location of your Squid access log:

# TAG: access_log file
#access_log /usr/local/squid/logs/access.log
#access_log /var/log/squid/logs/access.log # Red Hat version

This option needs no modification; it controls which types of report are generated:

# TAG: report_type type
# report_type topsites users_sites sites_users date_time denied auth_failures site_user_time_date

The following options exist:

Topsites - the most visited sites, by number of connections and bytes.
Sites_users - shows which users accessed a specific site.
Users_sites - shows the sites accessed by a specific user.
Date_time - bytes used/transferred per day and hour.
Denied - shows access attempts to sites forbidden by the ACLs.
Auth_failures - shows a user's failed authentication attempts (mistyped passwords).
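Putting the parameters discussed above together, a minimal sarg.conf might look like this (the values are the ones used in this article; adjust the paths to your system):

```
language Portuguese
title "Squid User Access Reports"
output_dir /home/squid/report/
user_ip no
access_log /var/log/squid/access.log
report_type topsites users_sites sites_users date_time denied auth_failures site_user_time_date
```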

Once SARG is configured, all that remains is to generate the reports. Below are some examples of how to use it.

For example, to e-mail the report for a given date range:
sarg -e [email protected] -d 01/01/2003-06/01/2003

Another useful example is filtering by URL address; the command below generates a report only for the addresses given:
sarg -s

Configuring the date format
sarg -df [e = Europe -> dd/mm/yy | u = USA -> mm/dd/yy]

Report by user and IP
sarg -i wrochal

Report by time
sarg -t [HH, HH:MM, HH:MM:SS]

Report by user
sarg -u wrochal

Now just build the report the way you want it. Good luck!
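As a small sketch, the date range for the -d option can also be built dynamically; the example below (assuming GNU date) prints the sarg command for the last seven days instead of running it:

```shell
#!/bin/sh
# Build a dd/mm/yyyy-dd/mm/yyyy range for sarg covering the last 7 days.
START=$(date -d '7 days ago' +%d/%m/%Y)
END=$(date +%d/%m/%Y)
# Echoed as a dry run; remove 'echo' to actually generate the report.
echo sarg -d "$START-$END"
```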

Reports excluding sites, strings and users

Many people ask how to generate reports that exclude a given site, user or string. Here is how to use this feature:

exclude.hosts - each line holds one domain/URL that will not be shown in the report. Useful for, say, Intranet download addresses that pass through Squid but consume no Internet bandwidth at all.

Add this to sarg.conf: exclude_hosts /etc/sarg/exclude.hosts

exclude.strings - if a log line contains one of the strings in this file (one string per line), that log line is left out of the report. With this you can filter anything out of the report.

Add this to sarg.conf: exclude_string /etc/sarg/exclude.strings

exclude.users - the users listed in this file (one per line) will not be included in the report.

Add this to sarg.conf: exclude_users /etc/sarg/exclude.users
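As a sketch, the three exclusion files are plain text with one entry per line; here they are written to a scratch directory instead of /etc/sarg/, and the domain, string and user are made-up examples:

```shell
#!/bin/sh
DIR=$(mktemp -d)                                      # stand-in for /etc/sarg
echo 'intranet.example.com' >> "$DIR/exclude.hosts"   # domain hidden from reports
echo 'windowsupdate'        >> "$DIR/exclude.strings" # log lines containing this are dropped
echo 'backupuser'           >> "$DIR/exclude.users"   # user omitted from reports
wc -l "$DIR"/exclude.*                                # each file now has one entry
```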

Monitoring with Webalizer

Webalizer works differently from SARG: it creates reports with totals, emphasizing bandwidth usage, throughput, and so on. It is an excellent option for comparing network usage across different periods.

For monitoring user access, however, SARG is the better alternative. Webalizer creates reports based on the Squid and Apache services.

Installing Webalizer

Download it from the project's site. After downloading, unpack and install it:
# ./configure
# make
# make install

Advanced options:

To install with a specific language, check the lang directory for the supported languages; the language is normally chosen at configure time (e.g. ./configure --with-language=portuguese, assuming your version supports this flag).




Main parameters

Which log file contains the data needed to generate the report:

LogFile /var/log/squid/access.log

Which type of service the report will be generated for:

LogType squid

Directory where the report will be generated:

OutputDir /home/webalizer/
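Put together, a minimal webalizer.conf fragment for Squid could look like this (ReportTitle and HostName are further standard Webalizer options, added here as a suggestion):

```
LogFile     /var/log/squid/access.log
LogType     squid
OutputDir   /home/webalizer/
ReportTitle Usage statistics for
HostName    proxy.example.com
```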


Monitoring with Calamaris

Calamaris is a program written in Perl that generates detailed reports on Internet usage from the log files of several proxy servers, such as NetCache, Inktomi Traffic Server, the Oops! proxy server, Novell Internet Caching System, Compaq TaskSmart, Netscape/iPlanet Web Proxy Server and, of course, Squid. The reports are simple in presentation but very rich in the detail extracted from the log files, and they can be generated either in HTML or as text to be sent by email.

Using this software is very simple: first download the latest version, unpack the archive in a directory of your choice (in our case /usr/local/calamaris), and it is ready to use. Below is a simple example of the command needed to generate reports from the Squid log.
# /usr/local/calamaris/calamaris -a -F html /var/log/squid/access.log >/srv/www/default/html/calamaris/index.html

The command above is enough to generate excellent log-analysis reports. The -a option tells Calamaris to generate all the reports; -F html specifies the report format we want, in this case HTML; /var/log/squid/access.log is the location of the Squid log file; and /srv/www/default/html/calamaris/index.html is where the generated report is written, in this case a folder in the Apache tree so it can be viewed from any station on the network.

Figure 1 shows the types of reports generated, and Figure 2 shows one of these reports, which presents information on requests, organized by extension.

Figure 01

Figure 02

Ideally you would write a script so that Calamaris can be scheduled through cron, but that will not be covered here, since it has to be tailored to each person's needs; besides, the software itself already ships some examples of how to do this.
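As a minimal sketch of such scheduling, a hypothetical /etc/cron.d entry could rerun the exact command above every morning (the paths are the ones used in this article; adjust them to your system):

```
# /etc/cron.d/calamaris (hypothetical) - rebuild the report daily at 06:00
0 6 * * * root /usr/local/calamaris/calamaris -a -F html /var/log/squid/access.log > /srv/www/default/html/calamaris/index.html
```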


Monitoring with Squid-Graph

Squid-Graph, like Calamaris, is written in Perl, but, as its name suggests, it is a generator of graphs of proxy server usage. It tends to present more condensed information about accesses and data transfers, yet it is still another interesting alternative and can complement your set of administration tools.

It can be obtained from the project's site, version 3.1 being the latest available at the time of writing. Remember that the Perl GD module must be installed; it is almost certainly on the CDs of your favourite distribution.

The installation process is very simple, since it only involves unpacking the files into the chosen directory; there is nothing to compile. In our case we installed it in /usr/local/squid-graph, as the commands below show.
# tar xzvf squid-graph-3.1.tar.gz -C /usr/local/
# mv /usr/local/squid-graph-3.1 /usr/local/squid-graph
# chmod +x /usr/local/squid-graph/bin/*

How you run the graph-generation command depends on how, and which, graphs are to be generated; a good way to run it so that cumulative graphs are produced is shown below.
# /usr/local/squid-graph/bin/squid-graph -c -n -o=/srv/www/default/html/squid-graph/ --title="Gráfico de uso do proxy" < /var/log/squid/access.log

In the command above we use the -c option, so cumulative graphs are generated; that is, two graphs for TCP accesses and transfers and two more for UDP accesses and transfers. The -n option keeps the Squid log-processing information from being echoed to the screen; -o=/srv/www/default/html/squid-graph/ is where the files (HTML and images) are written; --title="Gráfico de uso do proxy" customizes the heading of the HTML page where the graphs are shown; and finally comes the Squid log file.

Other interesting options exist, such as generating graphs for one specific URL or one specific user, as we can see below:

# cat /var/log/squid/access.log | grep "" | /usr/local/squid-graph/bin/squid-graph -c -n -o=/srv/www/default/html/squid-graph/ --title="Gráfico de uso do proxy"

To generate a graph of the accesses of one particular user, we would only need to put that user's IP address in the grep pattern shown above. We can use the same trick for graphs of particular file types, using the extension as the pattern; grep ".mp3", for example, is a good start. As you can see, Squid-Graph can be combined with other Linux commands, giving an endless range of options for its use; there are also other interesting options that were not covered here.
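A concrete sketch of the file-type variant: two fabricated log lines are filtered so that only the .mp3 request survives (in practice the result would be piped into squid-graph as above):

```shell
# Keep only .mp3 requests from a made-up two-line log excerpt.
printf '%s\n' \
  '1066036250.711 1858 192.168.0.10 TCP_MISS/200 1024 GET http://example.com/a.mp3 wrochal DIRECT/93.184.216.34 audio/mpeg' \
  '1066036251.200 900 192.168.0.11 TCP_HIT/200 512 GET http://example.com/index.html ana NONE/- text/html' |
grep '\.mp3'
# prints only the first line
```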

Author: William Rocha - [email protected]
Translation: Tatiana Freitas - [email protected]


More Information

Hugo Cisneiros, hugo_arroba_devin_ponto_com_ponto_br
Book: Squid - Configuring Proxy for Linux (Antonio Marcelo)
Webalizer
Sarg
Use and Configuration of SQUID (Part 1 and Part 2) - by Antonio Claudio Sales Pinheiro - [email protected]
