08-01-2025 11:57 PM
Is there any mechanism in Umbrella to permit a regex-style URL rule, e.g. block google.com but permit google.com/maps/page?
08-02-2025 06:41 AM - edited 08-05-2025 09:13 AM
Hi @manvik,
Judging by your title "Regex URL in Web policy", it can be assumed you are running with either SIG Essentials or SIG Advantage licensing, and having looked at your previous posts, it looks like you do have one of those SIG packages. The 'Web policy' pane does not show up unless you have that license anyway.
It is possible to block specific URL paths even if you're only using the DNS policy, but it requires the Intelligent Proxy feature turned on. Allowing specific URL paths is not supported, only blocking.
To directly answer the question of allowing/blocking specific URL paths within the Web policy: yes, it is possible. You could, for example, configure rules that block bbc.co.uk/news/wales, allow bbc.co.uk/news, but block bbc.co.uk.

If you are doing this, it is good practice to ensure that HTTPS Inspection is turned on in the Web policy and that the domain hosting the URL is NOT added to the Selective Decryption List (though this is not a requirement in all cases). You will also need to create at least two Web policy rules within your ruleset, e.g. one for the block and another for the allow.

For your specific example of google.com and google.com/maps, it is possible, but it wouldn't be as simple as adding 'google.com/maps' to a destination list: Google Maps leverages other domains that would now be blocked as a side effect of blocking the 'google.com' domain. So, in this case, you would probably use an Application Setting to allow 'Google Maps' as an application, but even that may not be enough to get it to work; it depends on the complexity of the site or URL path you are visiting, e.g. whether iframes are used. In the policy, your allow rule would have a destination list, and possibly an application list, attached to match what you want to allow, and the block rule would sit beneath it to block 'google.com'.
So you can certainly achieve the above using a Destination List, or a combination of that with Content Categories or Application Settings; you just need to structure your rules appropriately.

The earlier example of blocking specific parts of bbc.co.uk could be achieved as per the image below, where each rule within your matched ruleset references a separate destination list to block or allow the specific paths (note the names and the rule number order). It is an unrealistic example, but it shows how granular you can get. Destination List entries carry implicit wildcards (you do not need to add a wildcard yourself), which is why three rules are needed.

The documentation below should also explain this; the sub-sections 'Control Access to Custom URLs' and 'Wildcards and Destination Lists', within the 'Manage Destination Lists' parent section, are good reads.
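To make the first-match ordering and the implicit wildcards concrete, here is a small Python sketch that simulates how the three bbc.co.uk rules would evaluate. This is purely illustrative: the `entry_matches` helper and `evaluate` function are my own approximations of the behaviour described above, not Umbrella code or its API.

```python
def entry_matches(url: str, entry: str) -> bool:
    """A Destination List entry matches the URL itself and, via the
    implicit wildcard, anything beneath that path."""
    return url == entry or url.startswith(entry + "/")

# Ordered ruleset from the bbc.co.uk example: the first matching
# rule wins, which is why the most specific path comes first.
rules = [
    ("block", "bbc.co.uk/news/wales"),
    ("allow", "bbc.co.uk/news"),
    ("block", "bbc.co.uk"),
]

def evaluate(url: str) -> str:
    for action, entry in rules:
        if entry_matches(url, entry):
            return action
    return "allow"  # assumed default when no rule matches

print(evaluate("bbc.co.uk/news/wales/article"))  # block
print(evaluate("bbc.co.uk/news/uk"))             # allow
print(evaluate("bbc.co.uk/sport"))               # block
```

Note that if the allow rule were placed above the wales block rule, bbc.co.uk/news/wales would be allowed too, since its implicit wildcard covers everything under /news; the rule order is what makes the granularity work.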
These sections (and more importantly, their sub-sections) of the documentation are relevant:
https://docs.umbrella.com/umbrella-user-guide/docs/manage-destination-lists
https://docs.umbrella.com/umbrella-user-guide/docs/manage-web-policies
https://docs.umbrella.com/umbrella-user-guide/docs/manage-content-categories
https://docs.umbrella.com/umbrella-user-guide/docs/application-settings
If anything is unclear or you have any further questions, just ask. Hope that wasn't too confusing.
08-02-2025 03:09 AM
@manvik hello there. Umbrella's DNS policy works at the DNS level, so it only sees domain names, not full URLs. That means you can block or allow google.com, but not specific paths like google.com/maps.
If you need that kind of detailed control, you'd need Umbrella's Secure Web Gateway feature. So, short answer: no, not with DNS-only Umbrella, but yes if you're using SWG.
Peace!
-Enes
08-03-2025 09:41 PM
We're using Umbrella SIG Essentials.
08-02-2025 05:13 AM
Check this guideline:
https://docs.umbrella.com/deployment-umbrella/docs/conduct-a-pattern-search