
Self-Service Gateway

[Team | Cloud Team] [Tech Lead | TBC] [Service Owner | TBC] [Service Manager | TBC] [Product Manager | TBC]

This page gives an overview of the Self-Service Gateway, describing its current status, where and how it's developed and deployed, and who is responsible for maintaining it.

Service Description

This service provides a web application for the purchase and management of data storage services. The two main classes of storage available are research storage, provided by Research Computing Services, and institutional storage, provided by the Institutional File Storage service.

Some documentation on payment methods is also available.

Service Status

The Self-Service Gateway is currently live.

Contact

Technical queries and support should be directed to ssgw-admin@uis.cam.ac.uk and will be picked up by a member of the team working on the service. To ensure that you receive a response, always direct requests to ssgw-admin@uis.cam.ac.uk rather than reaching out to team members directly.

Issues discovered in the service or new feature requests should be opened as GitLab issues in the application repository.

Environments

The Self-Service Gateway is currently deployed to the following environments:

Name | Main Application URL | Django Admin URL | Backend API URL
Production | https://selfservice.uis.cam.ac.uk/ | https://selfservice.uis.cam.ac.uk/admin | https://selfservice.uis.cam.ac.uk/api/swagger.json
Staging | https://test.selfservice.uis.cam.ac.uk/ | https://test.selfservice.uis.cam.ac.uk/admin | https://test.selfservice.uis.cam.ac.uk/api/swagger.json
Development | https://webapp.devel.ssgw.gcp.uis.cam.ac.uk/ | https://webapp.devel.ssgw.gcp.uis.cam.ac.uk/admin | https://webapp.devel.ssgw.gcp.uis.cam.ac.uk/api/swagger.json
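
As a quick, read-only check that an environment's backend API is reachable, its published Swagger document can be fetched directly. The following is a minimal sketch, assuming the third-party requests library is installed; it simply prints the API title and version reported by each environment.

    import requests

    # Backend API URLs taken from the environments table above.
    ENVIRONMENTS = {
        "production": "https://selfservice.uis.cam.ac.uk/api/swagger.json",
        "staging": "https://test.selfservice.uis.cam.ac.uk/api/swagger.json",
        "development": "https://webapp.devel.ssgw.gcp.uis.cam.ac.uk/api/swagger.json",
    }

    for name, url in ENVIRONMENTS.items():
        # A 200 response with a parseable spec is a reasonable liveness signal.
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        info = response.json().get("info", {})
        print(f"{name}: {info.get('title', 'unknown')} {info.get('version', '')}")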

The GCP console pages for managing the infrastructure of each component of the deployment are:

Name | Main Application Hosting | Database
Production | GCP Cloud Run | GCP Cloud SQL (Postgres)
Staging | GCP Cloud Run | GCP Cloud SQL (Postgres)
Development | GCP Cloud Run | GCP Cloud SQL (Postgres)

All environments share access to a set of secrets stored in the meta-project Secret Manager.
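
For reference, a secret held in the meta-project can be read programmatically with the google-cloud-secret-manager client, as in the minimal sketch below. The project and secret identifiers shown are placeholders, not the names actually used by the service.

    from google.cloud import secretmanager

    # Placeholder identifiers; substitute the real meta-project and secret names.
    PROJECT_ID = "example-meta-project"
    SECRET_ID = "example-secret"

    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{PROJECT_ID}/secrets/{SECRET_ID}/versions/latest"

    # Requires the Secret Manager Secret Accessor role on the meta-project.
    response = client.access_secret_version(name=name)
    print(response.payload.data.decode("utf-8"))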

Notification channel(s) for environments

Environment | Display name | Email
Production | Self-Service Gateway - DevOps Team email Channel | cloud@uis.cam.ac.uk
Staging | Self-Service Gateway - DevOps Team email Channel | cloud@uis.cam.ac.uk

Source code

The source code for the Self-Service Gateway is spread over the following repositories:

Repository | Description
Application Server | The source code for the main application server
CUFS API Microservice | The source code for the micro-service querying CUFS
Infrastructure Deployment | The Terraform infrastructure code for deploying the application server to GCP

Technologies used

The following gives an overview of the technologies the Self-Service Gateway is built on.

Category | Language | Framework
Server | Python 3.9 | Django 3.2
Client | JavaScript/jQuery | django-ucamprojectlight 1.1

Operational documentation

The following gives an overview of how the Self-Service Gateway is deployed and maintained.

How and where the Self-Service Gateway is deployed

  • The Database for storage license/project data is a PostgreSQL database hosted by GCP Cloud SQL.
  • The main web application is a Django application hosted by GCP Cloud Run.
  • The CUFS micro-service is also a Django application hosted by GCP Cloud Run.
  • The application stores documents (Purchase Orders) in a GCP storage bucket.
  • There are a number of asynchronous processes which rely on a GCP Cloud Tasks queue. The application submits jobs to the queue that call back the application's cloud_tasks/ endpoint (see the sketch after this list).
  • There are a number of scheduled processes that are invoked by GCP Cloud Scheduler.
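
To illustrate the Cloud Tasks pattern described above, the sketch below enqueues an HTTP task that calls back the application's cloud_tasks/ endpoint using the google-cloud-tasks client. The project, location, queue and callback URL are placeholder values rather than the service's actual configuration.

    from google.cloud import tasks_v2

    # Placeholder values; the real project, location, queue and callback URL come
    # from the environment's configuration.
    PROJECT_ID = "example-project"
    LOCATION = "europe-west2"
    QUEUE = "example-queue"
    CALLBACK_URL = "https://selfservice.uis.cam.ac.uk/cloud_tasks/example-job"

    client = tasks_v2.CloudTasksClient()
    parent = client.queue_path(PROJECT_ID, LOCATION, QUEUE)

    task = {
        "http_request": {
            "http_method": tasks_v2.HttpMethod.POST,
            "url": CALLBACK_URL,
            "headers": {"Content-Type": "application/json"},
            "body": b'{"job": "example"}',
        }
    }

    # The queue delivers the request asynchronously and retries failed callbacks
    # according to its retry configuration.
    response = client.create_task(parent=parent, task=task)
    print(response.name)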

Deploying a new release

The README.md file in the Infrastructure Deployment repository explains how to deploy the Self-Service Gateway.

Monitoring

  • GCP Cloud Monitoring for tracking the health of applications in the environments and sending alert emails when problems are detected.
  • GCP Cloud Logging for tracking individual requests/responses to/from the web application and the synchronisation job application (see the sketch below).
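
For ad-hoc debugging, recent Cloud Run request logs can also be listed from Python with the google-cloud-logging client. This is a generic sketch; the project id and Cloud Run service name below are placeholders.

    from google.cloud import logging as cloud_logging

    # Placeholder identifiers; use the relevant environment's project and service name.
    PROJECT_ID = "example-project"
    SERVICE_NAME = "webapp"

    client = cloud_logging.Client(project=PROJECT_ID)
    log_filter = (
        'resource.type="cloud_run_revision" '
        f'AND resource.labels.service_name="{SERVICE_NAME}"'
    )

    # Print the twenty most recent matching log entries.
    for entry in client.list_entries(
        filter_=log_filter, order_by=cloud_logging.DESCENDING, max_results=20
    ):
        print(entry.timestamp, entry.severity, entry.payload)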

Debugging

The README.md files in each of the source code repositories provide information about debugging both local and deployed instances of the applications.

Testing

When developing new features, a manual "happy path" regression testing script is available and should be used to ensure that no existing features have been broken.

Administration

Any operational or administration issues should be raised as an issue in the repository dedicated to this purpose.

Service Management

The Team responsible for this service is Cloud Team.

The Tech Lead for this service is TBC.

The Service Owner for this service is TBC.

The Service Manager for this service is TBC.

The Product Manager for this service is TBC.

The following engineers have operational experience with this service and are able to respond to support requests or incidents: