I have a Spring Boot application and need to develop an API to download a file from a GCP bucket. I have the download path and the name of the bucket, e.g. … Before I start writing the download code, what are the preliminary steps? I read that I need a GOOGLE_APPLICATION_CREDENTIALS environment variable; where can I find it? Once …
Tag: google-cloud-storage
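The GOOGLE_APPLICATION_CREDENTIALS variable typically points to a service account key JSON file created under IAM & Admin > Service Accounts in the Cloud Console; the client library picks it up automatically as Application Default Credentials. Below is a minimal sketch of the download step under that assumption; the bucket, object name, and local path are hypothetical placeholders.

```java
import com.google.cloud.storage.Blob;
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.nio.file.Paths;

public class GcsDownloader {

  public static void main(String[] args) {
    // Uses Application Default Credentials, i.e. the key file referenced by
    // the GOOGLE_APPLICATION_CREDENTIALS environment variable.
    Storage storage = StorageOptions.getDefaultInstance().getService();

    // Hypothetical bucket and object name.
    Blob blob = storage.get(BlobId.of("my-bucket", "path/to/file.pdf"));

    // Writes the object's contents to a local file.
    blob.downloadTo(Paths.get("/tmp/file.pdf"));
  }
}
```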
How to upload files >32 MB via Google Cloud Run
I implemented a Cloud Run service in Java with Spring Boot that consumes file uploads via HTTP. The uploaded files are sometimes over 32 MB in size. I know that 32 MB is the fixed per-request limit on Cloud Run. The Cloud Run documentation mentions two approaches to support uploading larger files: resumable uploads and XML API multipart …
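A minimal sketch of the resumable-upload variant, assuming the Cloud Run service signs a URL and the client uploads directly to Cloud Storage (so the bytes never pass through Cloud Run). The bucket and object name are placeholders, and the runtime service account needs permission to sign URLs (e.g. the Service Account Token Creator role).

```java
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.HttpMethod;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.net.URL;
import java.util.Map;
import java.util.concurrent.TimeUnit;

public class SignedResumableUpload {

  // Returns a V4 signed URL that the client can POST to (with the same
  // x-goog-resumable: start header) to open a resumable upload session.
  public static URL createResumableSignedUrl(String bucket, String objectName) {
    Storage storage = StorageOptions.getDefaultInstance().getService();
    BlobInfo blobInfo = BlobInfo.newBuilder(BlobId.of(bucket, objectName)).build();

    return storage.signUrl(
        blobInfo,
        15, TimeUnit.MINUTES,
        Storage.SignUrlOption.httpMethod(HttpMethod.POST),
        Storage.SignUrlOption.withExtHeaders(Map.of("x-goog-resumable", "start")),
        Storage.SignUrlOption.withV4Signature());
  }
}
```

The client then POSTs to the returned URL with the `x-goog-resumable: start` header, receives a session URI in the `Location` response header, and PUTs the file (optionally in chunks) to that session URI.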
fhir.executeBundle replacing resource id… how to prevent this?
I am using this Java code to upload a resource to a FHIR store. The resource is as follows. But the id I am using (123456) is getting replaced by a hexadecimal number. This does not happen when using the fhirStores.import method. Is there any way to stop the executeBundle method from replacing my id, as I want to use a custom id in my resource?
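A minimal sketch of one way to keep a client-supplied id: send a transaction bundle whose entry uses PUT ("update as create") instead of POST, so the server updates/creates the resource at the given id rather than generating one. This assumes the FHIR store has enableUpdateCreate set to true; the store name below is a placeholder.

```java
import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.gson.GsonFactory;
import com.google.api.services.healthcare.v1.CloudHealthcare;
import com.google.api.services.healthcare.v1.CloudHealthcareScopes;
import com.google.api.services.healthcare.v1.model.HttpBody;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;
import java.util.Collections;

public class ExecuteBundleWithCustomId {

  public static void main(String[] args) throws IOException {
    // Placeholder FHIR store resource name.
    String fhirStoreName =
        "projects/my-project/locations/us-central1/datasets/my-dataset/fhirStores/my-store";

    // Transaction bundle: PUT to Patient/123456 keeps the id 123456.
    String bundleJson =
        "{"
            + "\"resourceType\": \"Bundle\","
            + "\"type\": \"transaction\","
            + "\"entry\": [{"
            + "  \"resource\": {\"resourceType\": \"Patient\", \"id\": \"123456\"},"
            + "  \"request\": {\"method\": \"PUT\", \"url\": \"Patient/123456\"}"
            + "}]"
            + "}";

    GoogleCredentials credentials =
        GoogleCredentials.getApplicationDefault()
            .createScoped(Collections.singletonList(CloudHealthcareScopes.CLOUD_PLATFORM));
    HttpRequestInitializer initializer = new HttpCredentialsAdapter(credentials);

    CloudHealthcare client =
        new CloudHealthcare.Builder(
                new NetHttpTransport(), GsonFactory.getDefaultInstance(), initializer)
            .setApplicationName("fhir-custom-id-sample")
            .build();

    HttpBody body =
        new HttpBody().setData(bundleJson).setContentType("application/fhir+json");
    HttpBody response =
        client.projects().locations().datasets().fhirStores().fhir()
            .executeBundle(fhirStoreName, body)
            .execute();

    System.out.println(response.getData());
  }
}
```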
Read a file from Google Cloud Storage in Dataproc
I'm trying to migrate a Scala Spark job from a Hadoop cluster to GCP. I have this snippet of code that reads a file and creates an ArrayBuffer[String]. This code runs on the cluster and gives me 3025000 chars; when I run the same code on Dataproc it gives 3175025 chars. I think there is whitespace added to the file contents, or …
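Since the original Scala snippet isn't shown, here is a rough Java Spark equivalent for counting characters of a GCS object on Dataproc; the bucket and path are hypothetical. One common cause of differing totals is line terminators: textFile strips them, while a raw byte read keeps CR/LF, so CRLF vs LF files (or a trailing newline) can produce different counts between environments.

```java
import org.apache.spark.sql.SparkSession;

public class GcsCharCount {

  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder().appName("gcs-char-count").getOrCreate();

    // Dataproc ships with the GCS connector, so gs:// paths work like HDFS paths.
    long chars = spark.read()
        .textFile("gs://my-bucket/data.txt")   // hypothetical object
        .javaRDD()
        .map(line -> (long) line.length())     // textFile strips line terminators
        .reduce(Long::sum);

    System.out.println("Total characters (excluding newlines): " + chars);
    spark.stop();
  }
}
```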
How to trigger a Cloud Dataflow pipeline job from a Cloud Function in Java?
I have a requirement to trigger a Cloud Dataflow pipeline from Cloud Functions, and the Cloud Function must be written in Java. The trigger for the Cloud Function is Google Cloud Storage's finalize/create event, i.e., when a file is uploaded to a GCS bucket, the Cloud Function must trigger the Dataflow pipeline. When I create a Dataflow pipeline (batch) and …
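A minimal sketch of one way to do this: a GCS-triggered (finalize) Java Cloud Function that launches a Dataflow job from a classic template via the Dataflow API client. The project id, region, template path, and the "inputFile" template parameter are hypothetical placeholders and depend on the actual template.

```java
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.json.gson.GsonFactory;
import com.google.api.services.dataflow.Dataflow;
import com.google.api.services.dataflow.model.LaunchTemplateParameters;
import com.google.api.services.dataflow.model.LaunchTemplateResponse;
import com.google.auth.http.HttpCredentialsAdapter;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.functions.BackgroundFunction;
import com.google.cloud.functions.Context;
import java.util.Map;

public class LaunchDataflowFn implements BackgroundFunction<LaunchDataflowFn.GcsEvent> {

  // Fields of the GCS finalize event payload that we care about.
  public static class GcsEvent {
    public String bucket;
    public String name;
  }

  @Override
  public void accept(GcsEvent event, Context context) throws Exception {
    GoogleCredentials credentials = GoogleCredentials.getApplicationDefault();
    Dataflow dataflow = new Dataflow.Builder(
            GoogleNetHttpTransport.newTrustedTransport(),
            GsonFactory.getDefaultInstance(),
            new HttpCredentialsAdapter(credentials))
        .setApplicationName("gcs-to-dataflow-trigger")
        .build();

    // Pass the uploaded object to the template; the parameter name depends on the template.
    LaunchTemplateParameters params = new LaunchTemplateParameters()
        .setJobName("gcs-triggered-" + System.currentTimeMillis())
        .setParameters(Map.of("inputFile", "gs://" + event.bucket + "/" + event.name));

    LaunchTemplateResponse response = dataflow.projects().locations().templates()
        .launch("my-project-id", "us-central1", params)
        .setGcsPath("gs://my-templates-bucket/my-template")   // classic template location
        .execute();

    System.out.println("Launched Dataflow job: " + response.getJob().getId());
  }
}
```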