I have been working on a project recently where having detailed network information has been very helpful. I thought that it would be a good idea to share what I have learned recently about enabling VPC flow logs in my Google Cloud environment.
If you have more than one VPC in your environment, you will want to make sure you enable flow logs on the right one. You can list the available VPCs with the following command:
gcloud compute networks list
NAME         SUBNET_MODE  BGP_ROUTING_MODE  IPV4_RANGE  GATEWAY_IPV4
default      AUTO         REGIONAL
development  AUTO         REGIONAL
production   AUTO         REGIONAL
shared       CUSTOM       REGIONAL
staging      AUTO         REGIONAL
You cannot enable flow logs for an entire VPC in one shot; they are enabled per subnet, and subnets are regional. You can get a list of the subnets in your project with this command:
gcloud compute networks subnets list
NAME     REGION                   NETWORK  RANGE
default  us-west2                 default  10.168.0.0/20
default  asia-northeast1          default  10.146.0.0/20
default  us-west1                 default  10.138.0.0/20
default  southamerica-east1       default  10.158.0.0/20
default  europe-west4             default  10.164.0.0/20
default  asia-east1               default  10.140.0.0/20
default  europe-north1            default  10.166.0.0/20
default  asia-southeast1          default  10.148.0.0/20
default  us-east4                 default  10.150.0.0/20
default  europe-west1             default  10.132.0.0/20
default  europe-west2             default  10.154.0.0/20
default  europe-west3             default  10.156.0.0/20
default  australia-southeast1     default  10.152.0.0/20
default  asia-south1              default  10.160.0.0/20
default  us-east1                 default  10.142.0.0/20
default  us-central1              default  10.128.0.0/20
default  asia-east2               default  10.170.0.0/20
default  northamerica-northeast1  default  10.162.0.0/20
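That command lists every subnet in the project across all of your VPCs. If you only care about the subnets of a single network, you can restrict the listing; for example, for the production VPC from the earlier list (the --network flag should limit the output to one network, though check your gcloud version if it complains):
gcloud compute networks subnets list --network=production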
Once you have the list of subnets, you can enable flow logs on the ones you want by running the following command, changing the region to the one you want to enable flow logs for.
gcloud beta compute networks subnets update default --enable-flow-logs --region=us-east1
Updated [https://www.googleapis.com/compute/beta/projects/austincloudguru-123456/regions/us-east1/subnetworks/default].
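Since the default network has a subnet in every region, enabling flow logs one region at a time gets tedious. A small shell loop like the one below is one way to handle it; the region list here is just an example, so swap in the regions you care about:
# Enable flow logs on the default subnet in a few example regions
for region in us-east1 us-west1 europe-west1; do
    gcloud beta compute networks subnets update default \
        --enable-flow-logs --region="${region}"
done
You can confirm the setting took by describing the subnet and looking for the flow log flag in the output:
gcloud compute networks subnets describe default --region=us-east1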
It takes a few minutes for your logs to start flowing into Stackdriver Logging. You can validate that they are arriving with the following command:
gcloud logging read "resource.type=gce_subnetwork AND logName=projects/austincloudguru-123456/logs/compute.googleapis.com%2Fvpc_flows" --limit 10 --format json
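If you only want the entries for a particular subnet or region, you can tighten the filter with the gce_subnetwork resource labels. The label names below (subnetwork_name and location) are my reading of that resource type, so double-check them against an entry returned by the previous command:
gcloud logging read "resource.type=gce_subnetwork AND resource.labels.subnetwork_name=default AND resource.labels.location=us-east1 AND logName=projects/austincloudguru-123456/logs/compute.googleapis.com%2Fvpc_flows" --limit 10 --format json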
Now you can search the logs in Stackdriver Logging. If you want to do further analysis, you can create a sink and send them to BigQuery. First, create the BigQuery dataset that will store your logs:
bq mk acg_flow_logs
Next, you can add a logging sink to send the logs to BigQuery:
gcloud beta logging sinks create acg-flow-logs \
bigquery.googleapis.com/projects/austincloudguru-123456/datasets/acg_flow_logs \
--log-filter='resource.type="gce_subnetwork" AND logName="projects/austincloudguru-123456/logs/compute.googleapis.com%2Fvpc_flows"'
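One thing to watch: the sink writes to BigQuery as a dedicated service account (its writer identity), and that account needs write access to the acg_flow_logs dataset before any rows show up; the create command should print a reminder with the exact account. You can also look it up afterwards and then grant it WRITER (BigQuery Data Editor) access on the dataset:
# Show the sink's writer identity so you can grant it access on the dataset
gcloud beta logging sinks describe acg-flow-logs --format="value(writerIdentity)"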
With the logs in BigQuery, you can run more in-depth analysis of your traffic. I have written a couple of queries for the specific data I am looking for, but they are tied too closely to my project to be worth sharing; a more generic starting point is sketched at the end of this post. I have been looking around for a pre-built Data Studio dashboard to better visualize the data, but I have not found one yet. If you know of any, let me know in the comments.
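For a generic starting point, a query along these lines totals traffic between source and destination IP pairs. Treat it as a sketch: the wildcard table name assumes the usual log-export naming (compute_googleapis_com_vpc_flows_YYYYMMDD) and the field names come from the VPC flow log record format, so adjust both to match what actually lands in your dataset:
# Top source/destination pairs by bytes sent, across all exported daily tables
bq query --use_legacy_sql=false '
SELECT
  jsonPayload.connection.src_ip AS src_ip,
  jsonPayload.connection.dest_ip AS dest_ip,
  SUM(CAST(jsonPayload.bytes_sent AS FLOAT64)) AS total_bytes
FROM
  `austincloudguru-123456.acg_flow_logs.compute_googleapis_com_vpc_flows_*`
GROUP BY
  src_ip, dest_ip
ORDER BY
  total_bytes DESC
LIMIT 20'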