Unverified Commit 6d8d616e authored by Ivan Gabriele's avatar Ivan Gabriele Committed by GitHub

fix(db): rewrite backup and restore scripts (#1224)

* fix(db): rewrite backup and restore scripts
* refactor: remove useless left db seed info
* fix(scripts): remove useless arg check in restoreSnapshot.js
parent 08d8e292
......@@ -147,8 +147,4 @@ Temporary Items
# Docker shared backups volume:
/backups/*
!/backups/.gitkeep
# Prevent any script from being pushed within the production database seeds:
/db/seeds/prod/*
!/db/seeds/prod/.gitkeep
!/backups/README.md
......@@ -14,6 +14,11 @@ help make it even better than it is today!
- [Known Issues](#known-issues)
- [Docker Compose](#docker-compose)
- [Jest Watch](#jest-watch)
- [Common Tasks](#common-tasks)
- [Database backup in production](#database-backup-in-production)
- [Database restore in production](#database-restore-in-production)
- [Database snapshot update in development](#database-snapshot-update-in-development)
- [Database snapshot restore in development](#database-snapshot-restore-in-development)
- [Naming Guidelines](#naming-guidelines)
- [API-related methods](#api-related-methods)
- [React methods](#react-methods)
......@@ -46,22 +51,15 @@ yarn setup
yarn dev
```
The website should now be available at: <http://localhost:3100>.
5 sample users have been generated during setup:
- Administrator:
  - Email: `doris@sea.com`<br>
    Password: `Azerty123`
- Regional Administrator:
  - Email: `deb@sea.com`<br>
    Password: `Azerty123`
- Contributors:
  - Email: `nemo@sea.com`<br>
    Password: `Azerty123`
  - Email: `astrid@sea.com`<br>
    Password: `Azerty123`
  - Email: `marin@sea.com`<br>
    Password: `Azerty123`
### Standalone
......@@ -95,7 +93,6 @@ This repository comes with multiple useful npm scripts (run via `yarn <script>`)
- `db:migrate` Migrate database schema.
- `db:migrate:make`: Create a new database migration file.
- `db:restore`: Restore a database dump.
- `db:seed`: Seed the database via a mix of dummy and real production data.
- `db:snapshot:restore`: Restore the dev database dump.
- `db:snapshot:update`: Update the dev database dump file.
- `dev`: Start a full development instance (including Docker images).
......@@ -103,8 +100,6 @@ This repository comes with multiple useful npm scripts (run via `yarn <script>`)
- `dev:packages`: Run the packages instance in dev (watch + live-reload) mode.
- `setup`: Setup (or refresh) a ready-to-use dev environment.
- `setup:env`: Reset the dev environment variables (via the `.env` file).
- `setup:full`: Setup (or refresh) a ready-to-use dev environment **with** a new seed.<br>
_This also updates the dev/test database snapshot._
- `start`: Start a full production instance (without Docker images).
- `start:prod`: Run the production build & run script.
......@@ -116,28 +111,20 @@ This repository comes with multiple useful npm scripts (run via `yarn <script>`)
```json
{
  "editor.codeActionsOnSave": {
    "source.fixAll": true
  },
  "editor.defaultFormatter": "dbaeumer.vscode-eslint",
  "editor.formatOnSave": true,
  "editor.rulers": [100],
  "eslint.codeActionsOnSave.mode": "all",
  "eslint.format.enable": true,
  "eslint.packageManager": "yarn",
  "[css]": {
    "editor.defaultFormatter": "esbenp.prettier-vscode"
  },
  "[json]": {
    "editor.defaultFormatter": "esbenp.prettier-vscode"
  }
}
```
`extensions.json`
```json
{
  "recommendations": [
    "alexkrechik.cucumberautocomplete",
    "dbaeumer.vscode-eslint",
    "editorconfig.editorconfig",
    "esbenp.prettier-vscode",
    "jpoissonnier.vscode-styled-components",
    "mikestead.dotenv",
    "ms-azuretools.vscode-docker",
    "ryanluker.vscode-coverage-gutters"
  ]
}
```
......@@ -164,6 +151,56 @@ echo fs.inotify.max_user_watches=524288 | sudo tee -a /etc/sysctl.conf && sudo s
---
## Common Tasks
### Database backup in production
```sh
yarn db:backup
```
will dump your Docker database, generating a local `./backups/YYYY_MM_DD.sql` PostgreSQL dump file.
### Database restore in production
Supposing you have, for example, a `./backups/2021_12_10.sql` PostgreSQL dump file:
```sh
yarn db:restore 2021_12_10
```
will automatically restore this backup into your Docker database.
### Database snapshot update in development
To speed up CI and dev setup with real production PostgreSQL data, we use snapshots, which are in
fact PostgreSQL dump files (taken from production backups) into which we inject a fake administrator
and contributor so that they can be used locally (without needing to import the production
`PGRST_JWT_SECRET` environment variable, which would pose a security threat).
First run a `db:backup` in production and download the resulting file locally into `./backups`, then run:
```sh
yarn db:snapshot:update YYYY_MM_DD
```
This command will update `./db/snapshot.sql` with anonymized user data, real passwords being
replaced by fake ones. This file should be included in your git commits since it must be shared
between developers and is also used for CI tests.
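The anonymization step described above can be sketched as a text transform over the dump. This is an illustrative sketch only: the real replacement scheme used by `yarn db:snapshot:update` is not shown here, and the numbered `@example.com` addresses are an assumption.

```javascript
// Hypothetical sketch of the anonymization performed when building
// ./db/snapshot.sql: replace every real email address in the dump with a
// deterministic fake one so the snapshot can be shared between developers.
function anonymizeEmails(sql) {
  let counter = 0;

  // Naive email pattern, good enough for a dump's plain-text columns.
  return sql.replace(/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g, () => {
    counter += 1;

    return `user${counter}@example.com`;
  });
}

module.exports = { anonymizeEmails };
```

The same idea would apply to password hashes: match the hash column and substitute the hash of a known fake password such as `Azerty123`.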
### Database snapshot restore in development
See the explanations above to understand the purpose of a database snapshot.
You shouldn't have to run this command on its own since it's already run by `yarn setup`, but in
some cases you may want to run it manually via:
```sh
yarn db:snapshot:restore
```
---
## Naming Guidelines
### API-related methods
......@@ -220,7 +257,7 @@ interface {
Each commit message consists of a **type**, a **scope** and a **subject**:
```text
<type>(<scope>): <subject>
```
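For example, the subject line of this very commit follows that pattern:

```text
fix(db): rewrite backup and restore scripts
```

Here the type is `fix`, the scope is `db` and the subject is "rewrite backup and restore scripts".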
......
# Backups
This folder is reserved for PostgreSQL production backups (dump files), which are generated via
`yarn db:backup`.
They can be automatically restored via `yarn db:restore YYYY-MM-DDThh-mm-ss.SSSZ` (ISO Date string).
const ora = require("ora");
const postgrester = require("postgrester");
global.postgresterClient = postgrester.create({
axiosConfig: {
baseURL: "https://contributions-api.codedutravail.fabrique.social.gouv.fr",
},
});
global.spinner = ora();
exports.seed = async knex => {
global.spinner.start(`Emptying tables...`);
global.spinner.start(`Emptying tables: api.answers_comments`);
await knex("api.answers_comments").del();
global.spinner.start(`Emptying tables: api.answers_references`);
await knex("api.answers_references").del();
global.spinner.start(`Emptying tables: api.answers`);
await knex("api.answers").del();
global.spinner.start(`Emptying tables: api.questions`);
await knex("api.questions").del();
global.spinner.start(`Emptying tables: api.logs`);
await knex("api.logs").del();
global.spinner.start(`Emptying tables: users_agreements`);
await knex("users_agreements").del();
global.spinner.start(`Emptying tables: auth.users`);
await knex("auth.users").del();
global.spinner.start(`Emptying tables: api.locations_agreements`);
await knex("api.locations_agreements").del();
global.spinner.start(`Emptying tables: api.locations`);
await knex("api.locations").del();
global.spinner.start(`Emptying tables: api.agreements`);
await knex("api.agreements").del();
global.spinner.succeed(`Tables emptied.`);
};
exports.seed = async knex => {
global.spinner.start(`Generating agreements...`);
const { data: allAgreements } = await global.postgresterClient.get("/agreements");
const { data: locationsAgreements } = await global.postgresterClient.get("/locations_agreements");
const activeAgreementIds = locationsAgreements.map(({ agreement_id }) => agreement_id);
const agreements = allAgreements.filter(({ id }) => activeAgreementIds.includes(id));
await knex("api.agreements").insert(agreements);
global.spinner.succeed(`Agreements generated.`);
};
exports.seed = async knex => {
global.spinner.start(`Generating locations...`);
const { data: locations } = await global.postgresterClient.get("/locations");
await knex("api.locations").insert(locations);
const { data: locationsAgreements } = await global.postgresterClient.get("/locations_agreements");
await knex("api.locations_agreements").insert(locationsAgreements);
global.spinner.succeed(`Locations generated.`);
};
exports.seed = async knex => {
global.spinner.start(`Generating users...`);
const locationIds = (await knex("api.locations").select("id")).map(({ id }) => id);
global.users = [
{
email: "doris@sea.com",
id: "00000000-0000-4000-8000-000000000401",
location_id: locationIds[0],
name: "Doris L'Administratrice",
password: "Azerty123",
role: "administrator",
},
{
email: "nemo@sea.com",
id: "00000000-0000-4000-8000-000000000402",
location_id: locationIds[1],
name: "Nemo Le Contributeur",
password: "Azerty123",
role: "contributor",
},
{
email: "astrid@sea.com",
id: "00000000-0000-4000-8000-000000000403",
location_id: locationIds[2],
name: "Astrid La Contributrice",
password: "Azerty123",
role: "contributor",
},
{
email: "marin@sea.com",
id: "00000000-0000-4000-8000-000000000404",
location_id: locationIds[3],
name: "Marin Le Contributeur",
password: "Azerty123",
role: "contributor",
},
{
email: "deb@sea.com",
id: "00000000-0000-4000-8000-000000000405",
location_id: locationIds[3],
name: "Deb L'Administratrice Régionale",
password: "Azerty123",
role: "regional_administrator",
},
];
await knex("auth.users").insert(global.users);
await knex("users_agreements").insert(
(await knex("api.agreements").limit(3)).map(({ id: agreement_id }) => ({
agreement_id,
user_id: "00000000-0000-4000-8000-000000000402",
})),
);
await knex("users_agreements").insert(
(await knex("api.agreements").limit(3)).map(({ id: agreement_id }) => ({
agreement_id,
user_id: "00000000-0000-4000-8000-000000000403",
})),
);
await knex("users_agreements").insert(
(await knex("api.agreements").limit(2).offset(3)).map(({ id: agreement_id }) => ({
agreement_id,
user_id: "00000000-0000-4000-8000-000000000404",
})),
);
global.spinner.succeed(`Users generated.`);
};
exports.seed = async knex => {
global.spinner.start(`Generating questions...`);
const { data: questions } = await global.postgresterClient.get("/questions");
await knex("api.questions").insert(questions);
global.spinner.succeed(`Questions generated.`);
};
const DumDum = require("dumdum");
// const LABOR_LAW_ARTICLES = require("../../../packages/api/data/labor-code.json");
const dumdum = DumDum.create({ locale: "fr" });
// const LEGAL_REFERENCE_CATEGORY = [null, "agreement", "labor_code"];
// const LABOR_LAW_ARTICLES_LENGTH = LABOR_LAW_ARTICLES.length;
// function getRandomAnswerReference(answerId, category) {
// const answerReference = {
// answer_id: answerId,
// category,
// };
// switch (category) {
// case LEGAL_REFERENCE_CATEGORY[0]:
// return {
// ...answerReference,
// url: Math.random() < 0.5 ? "https://example.com" : null,
// value: dumdum.text([12, 120]),
// };
// case LEGAL_REFERENCE_CATEGORY[1]:
// return {
// ...answerReference,
// url: null,
// value: `Article ${Math.ceil(Math.random() * 99)}`,
// };
// case LEGAL_REFERENCE_CATEGORY[2]:
// return {
// ...answerReference,
// dila_id: LABOR_LAW_ARTICLES[Math.floor(Math.random() * LABOR_LAW_ARTICLES_LENGTH)].id,
// url: null,
// value: LABOR_LAW_ARTICLES[Math.floor(Math.random() * LABOR_LAW_ARTICLES_LENGTH)].id,
// };
// }
// }
// function getRandomAnswerReferences(answerId) {
// const answerReferences = [];
// let i = Math.floor(Math.random() * 10);
// while (i-- > 0) {
// const categoryIndex = Math.floor(Math.random() * 3);
// const category = LEGAL_REFERENCE_CATEGORY[categoryIndex];
// answerReferences.push(getRandomAnswerReference(answerId, category));
// }
// return answerReferences;
// }
function getRandomAnswer(diceBalance, question_id, agreement_id) {
const dice = Math.random();
switch (true) {
case dice < diceBalance[0]:
return {
agreement_id,
prevalue: "",
question_id,
state: "todo",
value: "",
};
case dice < diceBalance[1]:
return {
agreement_id,
prevalue: dumdum.text([260, 620]),
question_id,
state: "draft",
user_id: "00000000-0000-4000-8000-000000000402",
value: "",
};
case dice < diceBalance[2]:
return {
agreement_id,
prevalue: dumdum.text([260, 620]),
question_id,
state: "pending_review",
user_id: "00000000-0000-4000-8000-000000000402",
value: "",
};
case dice < diceBalance[3]:
return {
agreement_id,
prevalue: dumdum.text([260, 620]),
question_id,
state: "under_review",
user_id: "00000000-0000-4000-8000-000000000402",
value: dumdum.text([260, 620]),
};
case dice < diceBalance[4]:
return {
agreement_id,
prevalue: dumdum.text([260, 620]),
question_id,
state: "validated",
user_id: "00000000-0000-4000-8000-000000000402",
value: dumdum.text([260, 620]),
};
default:
return {
agreement_id,
prevalue: dumdum.text([260, 620]),
question_id,
state: "published",
user_id: "00000000-0000-4000-8000-000000000402",
value: dumdum.text([260, 620]),
};
}
}
exports.seed = async knex => {
global.spinner.start(`Generating answers...`);
const questions = await knex("api.questions").orderBy("index");
const agreements = await knex("api.agreements").orderBy("idcc");
const { data: publicAnswers } = await global.postgresterClient.get("/public_answers");
for (const question of questions) {
const genericAnswer = {
agreement_id: null,
prevalue: "",
question_id: question.id,
state: "pending_review",
value: dumdum.text([260, 620]),
};
await knex("api.answers").insert([genericAnswer]);
const answers = [];
let answersReferences = [];
const diceBalance = [
Math.random(),
Math.random(),
Math.random(),
Math.random(),
Math.random(),
].sort();
for (const agreement of agreements) {
global.spinner.text = `Generating answers: [${agreement.idcc}] ${question.index}) ${question.value}`;
const maybeAnswer = publicAnswers.find(
({ agreement_id, question_id }) =>
agreement_id === agreement.id && question_id === question.id,
);
if (maybeAnswer !== undefined) {
const answer = maybeAnswer;
answers.push({
...answer,
state: "validated",
});
const { data: foundAnswerReferences } = await global.postgresterClient
.eq("answer_id", answer.id)
.get("/answers_references");
answersReferences = answersReferences
.concat(foundAnswerReferences)
// eslint-disable-next-line no-unused-vars
.map(({ is_skipped, ...answerReference }) => answerReference);
continue;
}
answers.push(getRandomAnswer(diceBalance, question.id, agreement.id));
}
await knex("api.answers").insert(answers);
await knex("api.answers_references").insert(answersReferences);
}
// const answerIds = await knex("api.answers").map(({ id }) => id);
// global.spinner.text = `Generating answers references...`;
// const randomAnswersReferences = answerIds.reduce(
// (prev, answerId) => [...prev, ...getRandomAnswerReferences(answerId)],
// [],
// );
// await knex("api.answers_references").insert(randomAnswersReferences);
global.spinner.succeed(`Answers generated.`);
};
const ACTIONS = ["delete", "patch", "post"];
function getRandomIntBetween(min, max) {
min = Math.ceil(min);
max = Math.floor(max);
return Math.floor(Math.random() * (max - min + 1)) + min;
}
function getRandomIp() {
return Array.from({ length: 4 }, () => getRandomIntBetween(0, 255)).join(".");
}
function getRandomLog() {
const created_at = new Date(Date.now() - getRandomIntBetween(0, 30 * 24 * 60 * 60 * 1000));
const ip = getRandomIp();
const method = ACTIONS[getRandomIntBetween(0, 2)];
const path = "/dummy-path";
const user_id = `00000000-0000-4000-8000-00000000040${getRandomIntBetween(1, 5)}`;
return {
created_at,
ip,
method,
path,
user_id,
};
}
exports.seed = async knex => {
global.spinner.start(`Generating logs...`);
const logs = Array.from({ length: 100 }, getRandomLog);
  // The emptying seed targets "api.logs", so insert into the same table.
  await knex("api.logs").insert(logs);
global.spinner.succeed(`Logs generated.`);
};
const ora = require("ora");
const postgrester = require("postgrester");
global.postgresterClient = postgrester.create({
axiosConfig: {
baseURL: "https://contributions-api.codedutravail.fabrique.social.gouv.fr",
},
});
global.spinner = ora();
const DumDum = require("dumdum");
const AGREEMENTS = [
{ dila_container_id: "KALICONT000005635624", idcc: "0016" },
{ dila_container_id: "KALICONT000005635184", idcc: "0176" },
{ dila_container_id: "KALICONT000005635872", idcc: "0275" },
{ dila_container_id: "KALICONT000005635918", idcc: "1672" },
{ dila_container_id: "KALICONT000027172335", idcc: "3043" },
];
const VERSIONS = ["1.2.3", "4.5.6", "7.8.9"];
function getRandomIntInclusive(min, max) {
min = Math.ceil(min);
max = Math.floor(max);
return Math.floor(Math.random() * (max - min + 1) + min);
}
function getRandomDilaId() {
return `KALIARTI9999${getRandomIntInclusive(10000000, 99999999)}`;
}
function getRandomAgreementIdcc() {
return AGREEMENTS[Math.floor(Math.random() * AGREEMENTS.length)].idcc;
}
exports.seed = async knex => {
global.spinner.start(`Generating alerts...`);