Collect Skyhigh Secure Web Gateway (On-Premises) logs
This document explains how you can ingest Skyhigh Secure Web Gateway (On-Premises) logs to Google SecOps using Google Cloud Storage.
Skyhigh Secure Web Gateway (formerly McAfee Web Gateway) is an on-premises web security solution that provides malware detection, URL filtering, application control, data loss prevention, and HTTPS inspection to protect organizations from web-based threats and enforce acceptable use policies.
Before you begin
Make sure you have the following prerequisites:
- A Google SecOps instance
- Privileged access to the Skyhigh Secure Web Gateway management console
- A GCP project with Cloud Storage API enabled
- A Linux-based collection server with network access to the Skyhigh Secure Web Gateway appliance and outbound access to Google Cloud Storage
- Skyhigh Secure Web Gateway version 7.x or later
Configure Skyhigh Secure Web Gateway access log format
Configure the Log Handler rule set to generate access log entries in JSON format.
- Sign in to the Skyhigh Secure Web Gateway management console.
- Go to Policy > Rule Sets.
- Click Log Handler in the left navigation.
- Expand the Default rule set.
- Select the nested Access Log rule set.
- Click Add > Rule to create a new rule.
- Configure the rule with the following settings:
  - Name: Enter Write access log data for collection.
  - Criteria: Select Always.
  - Action: Select Continue.
  - Event: Click Add > Event.
- In the Add Event dialog:
  - Select Set User-Defined.logLine.
  - Click Parameters.
  - Build the log line value by combining the required properties. Use the User-Defined.logLine property to concatenate the fields into the desired format.
  - Click OK.
- Add a second event to write the log line to the access log file:
- Click Add > Event.
- Select File System Logging.
- Click Parameters.
- For the message parameter, select User-Defined.logLine.
- Click OK.
- Click Finish to save the rule.
- Click Save Changes at the top of the page.
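The exact set of properties you concatenate depends on your policy. As a sanity check, the assembled logLine should parse as valid JSON before you point a parser at it. A minimal sketch (the field names below are illustrative assumptions, not a required schema):

```shell
# Illustrative sample of a JSON access-log line the logLine rule might
# assemble; the field names here are assumptions for the example only.
log_line='{"timestamp":"2024-05-01T12:00:00Z","source_ip":"10.1.2.3","username":"jdoe","url":"https://example.com/","httpStatusCode":200,"blockReason":""}'

# Confirm the assembled line parses as valid JSON.
echo "$log_line" | python3 -c 'import json,sys; json.load(sys.stdin); print("valid JSON")'
```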
Configure File System Logging auto-push
Configure the Skyhigh Secure Web Gateway to automatically push access log files to the collection server.
- In the Skyhigh Secure Web Gateway console, go to Policy > Settings > Engines > File System Logging.
- Select the Access Log Configuration settings.
- Expand Settings for Rotation, Pushing, and Deletion.
- In the Auto Pushing section:
- Select the Enable auto pushing checkbox.
- In the Destination field, enter the FTP or HTTP URL of the collection server:
  - FTP example: ftp://COLLECTION_SERVER_IP:21/swg-logs/
  - HTTP example: http://COLLECTION_SERVER_IP:9111/logloader/
- In the User name field, enter the username for the collection server.
- In the Password field, enter the password for the collection server.
- In the Rotation section, configure the rotation interval based on your log volume:
- For high-volume environments, set rotation to every 5-15 minutes
- For standard environments, set rotation to every 30-60 minutes
- Click Save Changes.
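To choose an interval, it can help to estimate how many files per day each setting produces. A quick sketch (arithmetic only; actual counts are approximate, since rotation can also be triggered by file size):

```shell
# Approximate number of log files generated per day for each
# rotation interval, assuming time-based rotation only.
for interval in 5 15 30 60; do
  files_per_day=$(( 24 * 60 / interval ))
  echo "${interval}-minute rotation: ${files_per_day} files/day"
done
```

Shorter intervals mean lower ingestion latency but more, smaller files for the upload script to handle.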
Verify the log prefix
- In the Skyhigh Secure Web Gateway console, go to Configuration > Appliances > Syslog > Log Prefix.
- Verify that the prefix is set to mwg.
- If the prefix is different, change it to mwg and click Save Changes.
Create Google Cloud Storage bucket
- Go to the Google Cloud Console.
- Select your project or create a new one.
- In the navigation menu, go to Cloud Storage > Buckets.
- Click Create bucket.
- Provide the following configuration details:

| Setting | Value |
|---|---|
| Name your bucket | Enter a globally unique name (for example, skyhigh-swg-logs) |
| Location type | Choose based on your needs (Region, Dual-region, Multi-region) |
| Location | Select the location (for example, us-central1) |
| Storage class | Standard (recommended for frequently accessed logs) |
| Access control | Uniform (recommended) |
| Protection tools | Optional: Enable object versioning or retention policy |

- Click Create.
Configure the collection server to upload logs to GCS
Set up a collection server to receive the log files pushed by the Skyhigh Secure Web Gateway and upload them to the GCS bucket.
Install the gcloud CLI
- On the collection server, install the gcloud CLI:

  ```
  curl https://sdk.cloud.google.com | bash
  exec -l $SHELL
  ```

- Initialize the gcloud CLI:

  ```
  gcloud init
  ```

- Authenticate with a service account or user account:

  ```
  gcloud auth login
  ```

- Set the default project:

  ```
  gcloud config set project PROJECT_ID
  ```
Configure an FTP server to receive pushed logs
- Install an FTP server on the collection server (for example, vsftpd):

  ```
  sudo apt-get install vsftpd -y
  ```

- Create a dedicated directory for the pushed log files:

  ```
  sudo mkdir -p /var/log/swg-logs
  sudo chown ftp:ftp /var/log/swg-logs
  ```

- Configure the FTP server to allow write access to the log directory.
- Start the FTP service:

  ```
  sudo systemctl enable vsftpd
  sudo systemctl start vsftpd
  ```
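What "allow write access" looks like depends on the FTP server. For vsftpd, a minimal sketch of the relevant /etc/vsftpd.conf directives might be the following; treat these as assumptions and verify them against your distribution's defaults and your authentication setup before use:

```
# Minimal vsftpd.conf sketch (assumptions; adjust to your environment).
# Allow local-user logins and uploads, and land pushed files
# in the dedicated log directory.
local_enable=YES
write_enable=YES
local_root=/var/log/swg-logs
```

After editing the configuration, restart the service with `sudo systemctl restart vsftpd`.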
Create the GCS upload script
- Create a directory for the upload script:

  ```
  sudo mkdir -p /opt/swg-gcs-uploader
  ```

- Create the file /opt/swg-gcs-uploader/upload_to_gcs.sh with the following content:

  ```bash
  #!/bin/bash

  LOG_DIR="/var/log/swg-logs"
  GCS_BUCKET="gs://skyhigh-swg-logs/swg-access-logs/"
  ARCHIVE_DIR="/var/log/swg-logs/archived"

  mkdir -p "$ARCHIVE_DIR"

  # Upload new log files to GCS
  for log_file in "$LOG_DIR"/*.log; do
    if [ -f "$log_file" ]; then
      gsutil cp "$log_file" "$GCS_BUCKET"
      if [ $? -eq 0 ]; then
        mv "$log_file" "$ARCHIVE_DIR/"
        echo "$(date): Uploaded and archived $(basename "$log_file")"
      else
        echo "$(date): Failed to upload $(basename "$log_file")"
      fi
    fi
  done

  # Clean up archived files older than 7 days
  find "$ARCHIVE_DIR" -type f -mtime +7 -delete
  ```

- Make the script executable:

  ```
  sudo chmod +x /opt/swg-gcs-uploader/upload_to_gcs.sh
  ```

- Replace skyhigh-swg-logs with the actual name of the GCS bucket.
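Before scheduling the script, you can sanity-check its move-and-archive flow without GCS credentials by substituting a local directory for the bucket. A throwaway dry run (the temporary paths and the `cp` stand-in for `gsutil cp` are for testing only):

```shell
# Dry run of the upload loop: a local directory stands in for the
# GCS bucket so the move/archive logic can be exercised offline.
LOG_DIR=$(mktemp -d)
FAKE_BUCKET=$(mktemp -d)   # stand-in for gs://BUCKET/swg-access-logs/
ARCHIVE_DIR="$LOG_DIR/archived"
mkdir -p "$ARCHIVE_DIR"
echo "test entry" > "$LOG_DIR/access1.log"

for log_file in "$LOG_DIR"/*.log; do
  if [ -f "$log_file" ]; then
    # In production this line is: gsutil cp "$log_file" "$GCS_BUCKET"
    if cp "$log_file" "$FAKE_BUCKET/"; then
      mv "$log_file" "$ARCHIVE_DIR/"
      echo "archived $(basename "$log_file")"
    fi
  fi
done

ls "$FAKE_BUCKET" "$ARCHIVE_DIR"
```

A successful run copies the file to the stand-in bucket directory and moves the original into the archive directory.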
Schedule the upload script
- Open the crontab editor:

  ```
  sudo crontab -e
  ```

- Add the following entry to run the upload every 10 minutes:

  ```
  */10 * * * * /opt/swg-gcs-uploader/upload_to_gcs.sh >> /var/log/swg-gcs-uploader.log 2>&1
  ```

- Save and exit.
Verify the upload
- Wait for the Skyhigh Secure Web Gateway to push a log file to the collection server.
- Verify the log file is present in the log directory:

  ```
  ls -la /var/log/swg-logs/
  ```

- Run the upload script manually to test:

  ```
  sudo /opt/swg-gcs-uploader/upload_to_gcs.sh
  ```

- Verify the file was uploaded to GCS:

  ```
  gsutil ls gs://skyhigh-swg-logs/swg-access-logs/
  ```
Apply configuration to Central Management cluster
If you are running multiple Skyhigh Secure Web Gateway appliances in a Central Management cluster:
- Repeat the File System Logging auto-push configuration on every appliance in the cluster.
The Log Handler rule set configuration is synchronized automatically across the cluster.
Retrieve the Google SecOps service account
- Go to SIEM Settings > Feeds.
- Click Add New Feed.
- Click Configure a single feed.
- In the Feed name field, enter a name for the feed (for example, Skyhigh SWG Logs).
- Select Google Cloud Storage V2 as the Source type.
- Select McAfee Web Protection as the Log type.
- Click Get Service Account. A unique service account email is displayed, for example:

  ```
  chronicle-12345678@chronicle-gcp-prod.iam.gserviceaccount.com
  ```

  Copy this email address; you will grant it access to the bucket in a later step.
- Click Next.
- Specify values for the following input parameters:
  - Storage bucket URL: Enter the GCS bucket URI with the prefix path: gs://skyhigh-swg-logs/swg-access-logs/
  - Source deletion option: Select the deletion option according to your preference:
    - Never: Never deletes any files after transfers (recommended for testing).
    - Delete transferred files: Deletes files after successful transfer.
    - Delete transferred files and empty directories: Deletes files and empty directories after successful transfer.
  - Maximum File Age: Include files modified in the last number of days (default is 180 days).
  - Asset namespace: The asset namespace.
  - Ingestion labels: The label to be applied to the events from this feed.
- Click Next.
- Review your new feed configuration in the Finalize screen, and then click Submit.
Grant IAM permissions to the Google SecOps service account
- Go to Cloud Storage > Buckets.
- Click the skyhigh-swg-logs bucket.
- Go to the Permissions tab.
- Click Grant access.
- Provide the following configuration details:
- Add principals: Paste the Google SecOps service account email
- Assign roles: Select Storage Object Viewer
- Click Save.
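If you prefer the command line, the same grant can be made with gcloud. This is a sketch that assumes gcloud is installed and authenticated with permission to modify bucket IAM; the bucket name and service-account address are the examples used earlier, so substitute your own values:

```shell
# Example values; replace with your bucket name and the service
# account email shown during feed setup.
BUCKET="skyhigh-swg-logs"
SA="chronicle-12345678@chronicle-gcp-prod.iam.gserviceaccount.com"

# Grant read-only object access (Storage Object Viewer) on the bucket.
gcloud storage buckets add-iam-policy-binding "gs://${BUCKET}" \
  --member="serviceAccount:${SA}" \
  --role="roles/storage.objectViewer"
```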
UDM mapping table
| Log Field | UDM Mapping | Logic |
|---|---|---|
| mediaType | additional.fields | Merged with labels from mediaType, reputation, Ssl_scanned, av_scanned_up, av_scanned_down, rbi, dlp |
| reputation | additional.fields | |
| Ssl_scanned | additional.fields | |
| av_scanned_up | additional.fields | |
| av_scanned_down | additional.fields | |
| rbi | additional.fields | |
| dlp | additional.fields | |
| intermediary_ip1 | intermediary.ip | Merged from intermediary_ip1, intermediary_ip2, client_ip |
| intermediary_ip2 | intermediary.ip | |
| client_ip | intermediary.ip | |
| intermediary_port | intermediary.port | Converted to integer |
| uriScheme | metadata.event_type | Set to "GENERIC_EVENT", overridden to "NETWORK_HTTP" if uriScheme in ["http", "https"] |
| uriScheme | network.application_protocol | Set to "HTTPS" if uriScheme matches "https", "HTTP" if matches "http" |
| http_action | network.http.method | Value copied directly |
| user_agent_comment | network.http.parsed_user_agent | Converted to parsed user agent |
| httpStatusCode | network.http.response_code | Converted to integer |
| user_agent_comment | network.http.user_agent | Value copied directly |
| serverToClientBytes | network.received_bytes | Converted to uinteger |
| clientToServerBytes | network.sent_bytes | Converted to uinteger |
| Filename | principal.file.full_path | Value copied directly |
| source_ip | principal.ip | Value copied directly |
| country | principal.location.country_or_region | Value copied directly |
| process_name | principal.process.command_line | Value copied directly |
| Ssl_client_prot | principal.resource.attribute.labels | Merged with labels from Ssl_client_prot, Ssl_server_prot |
| Ssl_server_prot | principal.resource.attribute.labels | |
| url | principal.url | Value copied directly |
| username | principal.user.userid | Value copied directly |
| blockReason | security_result.action | Set to "ALLOW", overridden to "BLOCK" if blockReason not empty |
| result | security_result.action_details | Value copied directly |
| category | security_result.category_details | Value copied directly |
| virus | security_result.detection_fields | Merged with labels from virus, Location, lastRule, applicationType, Mw_probablility, Discarded_host, domain_fronting_url |
| Location | security_result.detection_fields | |
| lastRule | security_result.detection_fields | |
| applicationType | security_result.detection_fields | |
| Mw_probablility | security_result.detection_fields | |
| Discarded_host | security_result.detection_fields | |
| domain_fronting_url | security_result.detection_fields | |
| blockReason | security_result.summary | Value copied directly |
| requested_path | target.file.full_path | Value copied directly |
| requested_host | target.hostname | Value copied directly |
| destination_ip | target.ip | Value copied directly |
| userID | target.user.userid | Value copied directly |
Need more help? Get answers from Community members and Google SecOps professionals.