Διαχείριση σχολικού εργαστηρίου/WebProxy
 
If we change ''src'' to '''dst''' in the acl line, we define a class of requests whose destination field matches the addresses we specify.
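
For instance, a minimal sketch (the network addresses and acl names here are illustrative, not from the original):

 # Match requests originating from the local network
 acl localnet src 192.168.1.0/24
 # Match requests destined for the given server addresses
 acl blockedhosts dst 10.0.0.5 10.0.0.6
 http_access deny blockedhosts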
 
==== Regular expressions ====
 
Regular expressions in Squid are case-sensitive by default. If you want to match both upper- and lower-case text, you can prefix the regular expression with -i. Have a look at the next example, where we use this to match either sex, SEX (or even SeX).
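
A minimal sketch of such a case-insensitive match (the acl name is illustrative):

 # -i makes the pattern match sex, SEX, SeX, and so on
 acl adult url_regex -i sex
 http_access deny adult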
 
Using regular expressions allows you to create more flexible access lists. So far you have only been able to filter sites by destination domain, where you have to match the entire domain to deny access to the site. Since regular expressions are used to match text strings, you can use them to match words, partial words or patterns in URLs or domains.
 
The most common use of regex filters in ACL lists is the creation of far-reaching site filters: if the URL or domain contains any of a set of banned words, access to the site is denied. If you wish to deny access to sites that contain the word sex in the URL, you can add a single acl rule, rather than trying to find every site that has adult material on it.
 
The big problem with regex filters is that not all sites that contain the word sex in the URL are pornographic. By denying these sites you may be infringing on people's rights, and you should consult a lawyer for advice on the legality of doing so.
 
Creating a list of sites that you don't want accessed can be tedious. There are companies that sell adult/unwanted-material lists that plug into Squid, but these can be expensive. If you cannot justify the cost, you may be able to locate a free list, though such lists will not offer full coverage.
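
Squid can also read acl values from a file, one entry per line, which makes long word lists easier to maintain. A minimal sketch (the file path and acl name are illustrative):

 # Load one regular expression per line from the quoted file
 acl banned url_regex -i "/etc/squid/banned.regex"
 http_access deny banned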
 
The url_regex acl type is used to match a pattern anywhere in the URL. Here is an example:
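
A minimal sketch of such a rule (the acl name and patterns are illustrative):

 # Deny URLs that contain any of the listed words, in any letter case
 acl badurl url_regex -i sex porn
 http_access deny badurl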
 
=== Configuring wpad ===