Platform Engineering Team/API Value Stream/As-We-Go-Blog-Notes
Revision as of 18:13, 13 September 2021
A space to drop notes about our work along the way, including our steps and learnings, to hopefully turn into a blog post about the API Platform by the end of the project.
Creating a repository
- Using Gerrit, I placed a request for the repository.
- After the repository was created, I cloned it.
- I then cloned Service-template-node into the newly created Gerrit repository and pushed it to master.
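The steps above can be sketched as shell commands. To keep the sketch self-contained, a local bare repository stands in for the Gerrit remote; in the real flow you clone the repository Gerrit created for you and import the template from mediawiki/services/service-template-node.

```shell
# Self-contained sketch of the repository bootstrap flow. A local bare repo
# stands in for the Gerrit-created repository; the real template lives at
# mediawiki/services/service-template-node on gerrit.wikimedia.org.
set -e
tmp=$(mktemp -d)
git init --bare "$tmp/origin.git"        # stand-in for the new Gerrit repo
git clone "$tmp/origin.git" "$tmp/work"  # "After the repository was created, I cloned it"
cd "$tmp/work"
# In practice, the service-template-node files are copied in at this point.
echo '{ "name": "example-node-api" }' > package.json   # placeholder file
git add package.json
git -c user.name=notes -c user.email=notes@example.invalid \
    commit -m "Import service-template-node skeleton"
git push origin HEAD:master              # push the imported template to master
```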
Discovering existing APIs
- Mostly MediaWiki/Wikitech searches, where multiple lists of APIs/services exist
- Codesearch, searching under "Wikimedia Services"
- The distinction between "service" and "API" is ambiguous in a lot of our documentation
- Extension APIs can be considered part of the Action API, although they are not available on all wikis; we may need a way to distinguish which wikis they are or are not enabled on.
Configuring CI Pipeline
- Clone Integrations Repository
- Create jobs in the project-pipeline
- Define the jobs in the layout
- Push for review
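The "define the jobs in the layout" step might look roughly like the YAML fragment below. The project name and template name here are hypothetical placeholders, not the actual contents of the integration repository; check its conventions before copying anything.

```yaml
# Hypothetical layout entry for the new repository -- the file location,
# project name, and template name are all assumptions for illustration.
projects:
  - name: mediawiki/services/example-node-api    # placeholder project
    template:
      - name: service-pipeline-test-and-publish  # placeholder pipeline template
```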
Testing CI Pipeline Configuration
- Pull recent changes from example-node-api repository
- Add a pipeline/config file to direct the jobs created to the main blubber file
- Make any small change
- Push to repo for review to see if tests run.
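The "pipeline/config file" mentioned above is a small YAML file kept in the repository. A minimal sketch might look like the following; the keys, stage name, and blubberfile path are assumptions to verify against the pipeline documentation.

```yaml
# .pipeline/config.yaml -- minimal sketch; the keys shown are assumptions
pipelines:
  test:
    blubberfile: blubber.yaml   # directs the jobs to the main Blubber file
    stages:
      - name: test              # runs the image's test entrypoint
```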
Hosting
Development/POC
- Can be either on Toolforge or CloudVPS
- Set up your API as a systemd service if running on CloudVPS
- Should create a separate CloudVPS project for your API
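Running the API as a systemd service on CloudVPS can be sketched as a unit file like the one below; the service name, user, and paths are hypothetical.

```ini
# /etc/systemd/system/example-node-api.service -- hypothetical unit file
[Unit]
Description=Example Node API (sketch)
After=network.target

[Service]
User=www-data
WorkingDirectory=/srv/example-node-api
ExecStart=/usr/bin/node server.js
Restart=on-failure
# Writing logs to stdout sends them to the journal, which fits the
# stdout-or-logfile requirement in the Logging notes.
StandardOutput=journal

[Install]
WantedBy=multi-user.target
```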
Logging
- We have a "staging" Logstash instance for projects under "deployment-prep".
- Logs must be sent to Kafka; they cannot be sent directly to Logstash
- CloudVPS is blocked off from the production environment
- Logs can be sent to the production Logstash from deployment-prep CloudVPS
- To send logs from deployment-prep to Logstash, set up your API as a systemd service. Ensure logs are sent either to stdout or to a supported logfile. Follow the instructions to add your service to the lookup table
- Services under K8s get automatic logging, but logs must adhere to the common logging schema
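Whether logs go to stdout or a logfile, each line should be a single JSON object matching the common logging schema. A minimal sketch of producing one such line from a shell wrapper is below; the field names are ECS-style assumptions, not the schema's verified layout.

```shell
# Emit one JSON log line per event to stdout. Field names are assumptions
# (ECS-style); check the actual common logging schema before relying on them.
log_event() {
  level=$1; msg=$2
  printf '{"@timestamp":"%s","log.level":"%s","message":"%s","service.name":"example-node-api"}\n' \
    "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "$level" "$msg"
}

log_event info "request handled"
```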
Storage
- If API is experimental/prototype:
  - CloudVPS relational DB storage
- If intended for production:
  - Submit ticket for storage request tagged with "DBA"
What if we want a non-relational storage solution?
Cloud VPS Storage
- You can easily set up a Postgres, MariaDB, or MySQL DB instance through the Horizon interface