GCP pipeline for minimum service cost with variable input data volume?
Which pipeline should I use to process JSON messages from Pub/Sub to BigQuery with minimum service cost, variable input data volume, and minimal manual intervention? Based on the online documentation I've read, I believe Dataflow is the service to go with for minimum cost, and its default throughput-based autoscaling should handle the variable input data volume. Please help. I also saw an option for Dataproc with the diagnose command, but I don't think diagnose is used for this purpose.
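For reference, this is roughly what I had in mind with Dataflow: launching the Google-provided streaming template that reads JSON messages from a Pub/Sub subscription and streams them into a BigQuery table. The project, subscription, table, and region names below are placeholders, not my actual setup:

```shell
# Launch the Google-provided Dataflow streaming template that reads JSON
# messages from a Pub/Sub subscription and writes them to a BigQuery table.
# PROJECT, my-subscription, my_dataset.my_table, and the region are placeholders.
gcloud dataflow jobs run pubsub-to-bq-job \
  --gcs-location gs://dataflow-templates/latest/PubSub_Subscription_to_BigQuery \
  --region us-central1 \
  --parameters \
inputSubscription=projects/PROJECT/subscriptions/my-subscription,\
outputTableSpec=PROJECT:my_dataset.my_table
```

My understanding is that a job launched this way autoscales workers based on throughput by default, which is why it seemed like the right fit for variable input volume.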