Hi, every day my GCS bucket receives multiple new objects across multiple folders in the same bucket. How can I create a single alert that covers all of those folders?
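A minimal sketch of one way to do this, assuming Data Access audit logs are enabled for Cloud Storage on that bucket: build one log-based metric whose filter ORs the folder prefixes together, then attach a single Cloud Monitoring alert policy to that metric. The bucket and folder names below are placeholders, not values from the video.

```python
# Hedged sketch: one log-based metric that matches object creation in several
# folders of the same bucket, so a single alert policy can watch all of them.
# Assumes Data Access audit logs are enabled for Cloud Storage; bucket and
# folder names are hypothetical.
from google.cloud import logging

client = logging.Client()

folders = ["folderA/", "folderB/", "folderC/"]  # hypothetical prefixes
prefix_clause = " OR ".join(
    f'protoPayload.resourceName:"objects/{f}"' for f in folders
)

log_filter = (
    'resource.type="gcs_bucket" '
    'AND resource.labels.bucket_name="my-bucket" '
    'AND protoPayload.methodName="storage.objects.create" '
    f'AND ({prefix_clause})'
)

metric = client.metric(
    "gcs-multi-folder-object-creates",
    filter_=log_filter,
    description="Object creations across selected folders in my-bucket",
)
if not metric.exists():
    metric.create()
```

With one metric covering every folder, a single alert policy on that metric behaves as the "one alert for all folders" you are asking about.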
Thanks
Good video sir
Keep creating videos
Thanks, keep supporting
Sorry for this, but is it possible to create a sink and a metric to track successes in Dataflow and BigQuery?
Do you mean an alert on Dataflow job success, or do the logs for job success need to be exported to a sink?
@techtrapture I'm building a dashboard that shows the errors and successes from the Airflow/Composer logs and from the scheduled queries in BigQuery, but I haven't found any metric that compares successes against errors.
You can create a log sink just for the Composer logs and then create log-based alerts or metrics on those logs.
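A minimal sketch of that suggestion using the google-cloud-logging client: a sink that exports only the Composer logs to BigQuery for the dashboard, plus two log-based metrics (errors vs. task successes) that a chart or alert can compare. The environment name, dataset, and the "Marking task as SUCCESS" string are assumptions, not values from the video.

```python
# Hedged sketch: export only Composer (Airflow) logs to BigQuery via a sink,
# and define two log-based metrics so successes and errors can be compared.
# Environment name, project, dataset and filter strings are assumptions.
from google.cloud import logging

client = logging.Client()

composer_filter = (
    'resource.type="cloud_composer_environment" '
    'AND resource.labels.environment_name="my-composer-env"'
)

# Sink: route the Composer logs into a BigQuery dataset for the dashboard.
sink = client.sink(
    "composer-logs-to-bq",
    filter_=composer_filter,
    destination="bigquery.googleapis.com/projects/my-project/datasets/composer_logs",
)
if not sink.exists():
    sink.create()

# Log-based metric for errors: any Composer entry at ERROR or above.
error_metric = client.metric(
    "composer-task-errors",
    filter_=composer_filter + " AND severity>=ERROR",
    description="Composer log entries at ERROR or above",
)

# Airflow usually logs "Marking task as SUCCESS" when a task finishes;
# treating that line as the success signal is an assumption here.
success_metric = client.metric(
    "composer-task-successes",
    filter_=composer_filter + ' AND textPayload:"Marking task as SUCCESS"',
    description="Composer task success log lines",
)

for m in (error_metric, success_metric):
    if not m.exists():
        m.create()
```

The two metrics can then be charted side by side in Cloud Monitoring, or the success/error counts queried directly from the BigQuery dataset the sink writes to.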
@techtrapture thanks so much for this and for the video!!!
Thanks for watching and support ❤️🎉