Serverless Made Easy: An Introduction to Google Cloud Functions With a Use Case

Traditionally, developers followed conventional server-based approaches to build and deploy applications. However, these approaches do not guarantee scalability, continuous uptime, or availability across diverse geographical regions. Cloud functions were introduced to address these limitations: they provide a way to build and deploy applications in the cloud, enabling developers to concentrate on implementing business logic without having to oversee server management.

Cloud Function

In GCP (Google Cloud Platform), Cloud Functions is offered as FaaS (Function as a Service). It is a serverless compute service that allows us to write and deploy code without managing any servers; our code is executed when an event occurs in a cloud service such as Cloud Storage, Pub/Sub, or the Cloud Firestore database. Cloud functions can be written and run using different runtimes such as Java, .NET, Node.js, PHP, Python, and Ruby.

Key Features of Cloud Functions in GCP

1. Pay-As-You-Go: We are charged solely for the time our function is actually executing, with no cost while it is idle. Cloud Functions dynamically starts and stops execution as needed in response to events.
2. Infrastructure Maintenance: We can run code in the cloud with no servers or containers to manage, and the platform scales our application automatically.
3. Integration: Cloud Functions integrates easily with other GCP services such as Cloud Storage, Pub/Sub, and Cloud Firestore, making it simple to add serverless computing to existing workflows.
4. Automation: It simplifies the process of managing applications. For example, businesses can automate tasks such as deployment, scaling, and monitoring.

Example Use Case

When a specific file (e.g., item.csv) becomes available in a Google Cloud Storage (GCS) bucket, a cloud function should automatically trigger and invoke an external API without requiring any manual intervention.

Steps to Create a Cloud Function

i) Create a new project using the Google Cloud Console, e.g., cloud-storage-function-trigger.

ii) Search for “cloud function” in the GCP Console search bar and open the Cloud Functions link from the results; it will redirect to the Cloud Functions page. Then enable the Cloud Functions API for the selected project.

iii) Grant access by going to the IAM & Admin service and adding the “Cloud Functions Developer” role to the specific service account, so that it can create, update, and delete cloud functions.

iv) Create a Cloud Storage bucket: go to the Cloud Storage service, click “Create”, specify the bucket name, and then create a resource path (the location of the file). For example, in the GCS (Google Cloud Storage) path product-service-store/export/full/outbound, product-service-store is the bucket name and /export/full/outbound is the path of the resource. Also create another Cloud Storage bucket, e.g., “source-code”; we will upload the Maven project zip file to this bucket.

v) Create a cloud function by navigating to the Cloud Functions page in the Google Cloud Console. Click “Create Function” and fill in details such as the function name, region, and trigger type. Based on our requirement, we select a Cloud Storage trigger.

vi) Cloud Storage Event Type: In the Event Type section, choose the event type. This specifies which Cloud Storage event (e.g., finalize/create) occurring in the GCS path will fire our cloud function, without any manual intervention.

vii) Write Function Code: For the Java runtime, create a simple Maven project and define a Cloud Storage background function by implementing the BackgroundFunction interface, overriding the accept() method, and adding our logic. The cloud function will then be executed automatically when the specified event occurs in the GCS bucket. Below is the sample code.
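The snippet below is a minimal sketch of such a function; the external API URL, the file-matching check, and the getBucket()/getName() accessors on GCSEvent are assumptions used purely for illustration.

package com.demo.cloud.storage.trigger;

import com.google.cloud.functions.BackgroundFunction;
import com.google.cloud.functions.Context;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.logging.Logger;

// Background function fired by Cloud Storage events (e.g. google.storage.object.finalize).
public class SampleFunction implements BackgroundFunction<GCSEvent> {

    private static final Logger logger = Logger.getLogger(SampleFunction.class.getName());

    // Hypothetical external API endpoint; replace with the real service URL.
    private static final String EXTERNAL_API_URL = "https://example.com/api/notify";

    @Override
    public void accept(GCSEvent event, Context context) throws Exception {
        logger.info("Received event for object " + event.getName() + " in bucket " + event.getBucket());

        // React only to the specific file our use case cares about (e.g. item.csv).
        if (event.getName() == null || !event.getName().endsWith("item.csv")) {
            logger.info("Ignoring object: " + event.getName());
            return;
        }

        // Invoke the external API once the expected file is available.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(EXTERNAL_API_URL))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"bucket\":\"" + event.getBucket() + "\",\"file\":\"" + event.getName() + "\"}"))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        logger.info("External API responded with status " + response.statusCode());
    }
}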

The GCSEvent class is used to read metadata information from the Google Cloud Storage (GCS) event payload at runtime.
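A minimal sketch of this class is shown below; the exact field set is an assumption, but the bucket and name attributes are what the sample function relies on. The Functions Framework deserializes the Cloud Storage event JSON into this POJO.

package com.demo.cloud.storage.trigger;

// Simple POJO populated by the Functions Framework from the Cloud Storage event payload.
// Field names mirror the GCS event attributes; only the ones we need are included here.
public class GCSEvent {

    private String bucket;       // Bucket in which the object was created or updated
    private String name;         // Full object path, e.g. export/full/outbound/item.csv
    private String contentType;  // MIME type of the object
    private String timeCreated;  // Creation timestamp of the object

    public String getBucket() { return bucket; }
    public void setBucket(String bucket) { this.bucket = bucket; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public String getContentType() { return contentType; }
    public void setContentType(String contentType) { this.contentType = contentType; }

    public String getTimeCreated() { return timeCreated; }
    public void setTimeCreated(String timeCreated) { this.timeCreated = timeCreated; }
}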

pom.xml:
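The full pom.xml is not reproduced here; a minimal sketch that pulls in the Functions Framework API might look like the following (the project coordinates, Java version, and dependency version are assumptions).

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.demo.cloud.storage.trigger</groupId>
  <artifactId>cloud-storage-trigger-function</artifactId>
  <version>1.0-SNAPSHOT</version>

  <properties>
    <maven.compiler.source>11</maven.compiler.source>
    <maven.compiler.target>11</maven.compiler.target>
  </properties>

  <dependencies>
    <!-- Provides the BackgroundFunction and Context interfaces -->
    <dependency>
      <groupId>com.google.cloud.functions</groupId>
      <artifactId>functions-framework-api</artifactId>
      <version>1.0.4</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</project>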
viii) Create a zip file of the Maven project, making sure the zip contains only the src directory and the pom.xml file; no other files or directories are allowed. As per our use case, specify the entry point (the entry class of our application), e.g., “com.demo.cloud.storage.trigger.SampleFunction”, choose the “ZIP from Cloud Storage” option, select the GCS bucket where our source-code zip file is present, and click the “Deploy” button. Finally, the application will be built and deployed, and the cloud function will be created.

Testing

To test the cloud function, upload a file into the provided Google Cloud Storage (GCS) path. The cloud function will be triggered and its logic executed automatically. Detailed logs can be viewed in the Google Cloud Console under the “Logging” section, providing insight into the execution of the cloud function, including any errors encountered.
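For a quick programmatic test, the file can also be uploaded with the google-cloud-storage client library. The sketch below assumes Application Default Credentials, the bucket and path from step iv, and a local item.csv; it is one way to trigger the function, not part of the original setup.

import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.nio.file.Files;
import java.nio.file.Paths;

public class UploadTestFile {
    public static void main(String[] args) throws Exception {
        // Uses Application Default Credentials to talk to Cloud Storage.
        Storage storage = StorageOptions.getDefaultInstance().getService();

        // Target object path matches the trigger path from step iv.
        BlobId blobId = BlobId.of("product-service-store", "export/full/outbound/item.csv");
        BlobInfo blobInfo = BlobInfo.newBuilder(blobId).setContentType("text/csv").build();

        // Uploading the file finalizes the object and fires the cloud function.
        storage.create(blobInfo, Files.readAllBytes(Paths.get("item.csv")));
        System.out.println("Uploaded item.csv; the cloud function should now be triggered.");
    }
}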

Conclusion

By leveraging serverless Google Cloud Functions, developers can build responsive, scalable, and cost-effective applications that meet modern demands. This serverless approach not only accelerates development but also offers the flexibility required to adapt to ever-changing business needs.

About the author

Jithendra Ganthakoru
