How to Restrict Access to Specific Websites using Squid Proxy Server on CentOS 6.2

Managing access to specific websites is a common requirement for many organizations. This can be for a variety of reasons, such as enhancing network security, improving productivity, or complying with regulatory requirements.

The solution is to use a proxy server, specifically the Squid proxy server.

Squid is a highly flexible and widely used proxy server that caches and delivers web content, optimizes bandwidth, improves response times, and can restrict access to specific websites. This tutorial shows how to restrict access to specific websites using the Squid proxy server on CentOS 6.2.

By following this tutorial, you will learn how to configure Squid to restrict access to specific websites during certain hours, thereby giving you greater control over your network’s internet usage. This can be particularly useful for businesses that want to ensure their network resources are used effectively.

Step 1: Open the squid.conf configuration file

[root@centos62 ~]# vi /etc/squid/squid.conf
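
Before editing, it is worth backing up the original configuration so you can roll back if anything goes wrong, for example:

[root@centos62 ~]# cp /etc/squid/squid.conf /etc/squid/squid.conf.bak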

Step 2: Create a web folder under /etc/squid

This directory will store auxiliary files such as Bad_Websites.squid.

[root@centos62 ~]# mkdir /etc/squid/web

Step 3: Create Bad_Websites.squid and add the bad websites list

[root@centos62 ~]# vi /etc/squid/web/Bad_Websites.squid

Example bad websites list:

#List in /etc/squid/web/Bad_Websites.squid
www.porn.com
www.badwebsites.com
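
A dstdomain entry matches the host name exactly as written. If you also want to block every subdomain of a site, Squid accepts a leading dot, for example:

#Matches badwebsites.com and all of its subdomains
.badwebsites.com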

Step 4: Define the surfing_hours ACL name, surfing time, and restricted websites file list

#Add this at the bottom of the ACL Section
#
acl surfing_hours time M T W H F 08:00-17:00
acl Bad_Websites dstdomain "/etc/squid/web/Bad_Websites.squid"
#
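
In a time ACL, each letter is one of Squid's day codes: S (Sunday), M (Monday), T (Tuesday), W (Wednesday), H (Thursday), F (Friday) and A (Saturday). As a hypothetical variation, an ACL covering weekend mornings only would look like this:

acl weekend_hours time A S 08:00-12:00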

Step 5: Always deny access to Bad_Websites, and restrict the webhostinggeeks.com network to surfing hours

These rules allow clients on the webhostinggeeks.com network to browse during surfing_hours only if the requested site is not in Bad_Websites; all other access from that network is denied.

# Allow the webhostinggeeks.com network during surfing hours, except Bad_Websites
http_access allow webhostinggeeks.com surfing_hours !Bad_Websites
http_access deny Bad_Websites
http_access deny webhostinggeeks.com
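
Squid evaluates http_access lines from top to bottom and acts on the first one that matches, so the order above matters. For illustration, if the network-wide deny came first, the allow rule would never be reached:

#Wrong order: the whole network is denied before the allow rule is checked
#http_access deny webhostinggeeks.com
#http_access allow webhostinggeeks.com surfing_hours !Bad_Websites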

Step 6: Restart the Squid proxy server for the changes to take effect

[root@centos62 ~]# service squid restart
Stopping squid: ................                           [  OK  ]
Starting squid: .                                          [  OK  ]
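
You can also check the file for syntax errors before restarting, or reload the configuration without fully stopping the proxy; both are standard Squid options:

[root@centos62 ~]# squid -k parse
[root@centos62 ~]# squid -k reconfigure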

Full configuration example:

#
# Recommended minimum configuration:
#
acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1

# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8	# RFC1918 possible internal network
acl localnet src 172.16.0.0/12	# RFC1918 possible internal network
acl localnet src 192.168.0.0/16	# RFC1918 possible internal network
acl localnet src fc00::/7       # RFC 4193 local private network range
acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines
acl webhostinggeeks.com src 192.168.1.0/24    # Your internal network

acl SSL_ports port 443
acl Safe_ports port 80		# http
acl Safe_ports port 21		# ftp
acl Safe_ports port 443		# https
acl Safe_ports port 70		# gopher
acl Safe_ports port 210		# wais
acl Safe_ports port 1025-65535	# unregistered ports
acl Safe_ports port 280		# http-mgmt
acl Safe_ports port 488		# gss-http
acl Safe_ports port 591		# filemaker
acl Safe_ports port 777		# multiling http
acl CONNECT method CONNECT

#Add this at the bottom of the ACL Section
#
acl surfing_hours time M T W H F 08:00-17:00
acl Bad_Websites  dstdomain "/etc/squid/web/Bad_Websites.squid"

#
# Recommended minimum Access Permission configuration:
#
# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access deny manager

# Allow the webhostinggeeks.com network during surfing hours, except Bad_Websites
http_access allow webhostinggeeks.com surfing_hours !Bad_Websites
http_access deny Bad_Websites
http_access deny webhostinggeeks.com

# Deny requests to certain unsafe ports
http_access deny !Safe_ports

# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost

#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#

# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
#http_access allow localnet
http_access allow localhost

# And finally deny all other access to this proxy
http_access deny all

# Squid normally listens to port 3128
http_port 3128

# We recommend you to use at least the following line.
hierarchy_stoplist cgi-bin ?

# Uncomment and adjust the following to add a disk cache directory.
#cache_dir ufs /var/spool/squid 100 16 256

# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid

# Add any of your own refresh_pattern entries above these.
refresh_pattern ^ftp:		1440	20%	10080
refresh_pattern ^gopher:	1440	0%	1440
refresh_pattern -i (/cgi-bin/|\?) 0	0%	0
refresh_pattern .		0	20%	4320
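
Once Squid is running, you can verify the rules from a client machine on the 192.168.1.0/24 network by sending requests through the proxy with curl. Assuming the proxy listens on 192.168.1.1 (substitute your server's IP), the first request below should return a 403 Forbidden response, while the second should load normally during surfing hours:

curl -I -x http://192.168.1.1:3128 http://www.badwebsites.com/
curl -I -x http://192.168.1.1:3128 http://www.example.com/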

Commands Mentioned:

  • vi /etc/squid/squid.conf – Opens the squid.conf configuration file in the vi text editor.
  • mkdir /etc/squid/web – Creates a new directory named ‘web’ under /etc/squid.
  • vi /etc/squid/web/Bad_Websites.squid – Opens the Bad_Websites.squid file in the vi text editor. If the file does not exist, it creates a new one.
  • acl surfing_hours time M T W H F 08:00-17:00 – Defines the ‘surfing_hours’ access control list (ACL) in Squid, limiting the allowed surfing time to Monday through Friday (Squid’s day codes, where H stands for Thursday), 08:00 to 17:00.
  • acl Bad_Websites dstdomain "/etc/squid/web/Bad_Websites.squid" – Defines the ‘Bad_Websites’ ACL, setting the list of restricted websites to those listed in the Bad_Websites.squid file.
  • http_access allow webhostinggeeks.com surfing_hours !Bad_Websites – Allows clients on the ‘webhostinggeeks.com’ network (192.168.1.0/24) to browse during the defined ‘surfing_hours’, as long as the requested site is not listed in ‘Bad_Websites’.
  • http_access deny Bad_Websites – Denies access to the websites listed in ‘Bad_Websites’.
  • http_access deny webhostinggeeks.com – Denies all other access from the ‘webhostinggeeks.com’ network, such as requests outside surfing hours.
  • service squid restart – Restarts the Squid proxy server to apply the changes made in the configuration.

Conclusion

In this tutorial, we have walked through the process of restricting access to specific websites using Squid Proxy server on CentOS 6.2. By defining specific access control lists (ACLs) and setting up a list of restricted websites, we can effectively control the internet usage within our network. This can be particularly beneficial for organizations that need to manage their network resources effectively or comply with certain regulatory requirements.

Remember, Squid is a flexible and powerful tool that can be configured to meet a wide range of needs. Whether you’re looking to improve network performance, enhance security, or control access to specific websites, Squid offers a solution.
