Initial version

This commit is contained in:
Flatlogic Bot 2026-03-24 09:55:44 +00:00
commit 08851d4d87
571 changed files with 166937 additions and 0 deletions

305
.cursorrules Normal file

@ -0,0 +1,305 @@
# Cursor Rules - Group 1: Development Philosophy & Coding Conventions
1. Overall Architecture & Structure:
- Enforce a clear separation of concerns between the backend and the frontend:
- **Backend**: Use Express for routing, Passport for authentication, and Swagger for API documentation. Organize code into modules such as routes, services, and helpers.
- **Example**:
- Routes: `src/routes/auth.js` for authentication routes.
- Services: `src/services/auth.js` for authentication logic.
- Helpers: `src/helpers/wrapAsync.js` for wrapping asynchronous functions.
- **Frontend**: Use Next.js with React and TypeScript. Structure components using functional components, hooks, and layouts.
- **Example**:
- Pages: `pages/index.tsx` for the main page.
- Components: `components/Header.tsx` for the header component.
- Layouts: `layouts/MainLayout.tsx` for common page layouts.
- Ensure that backend modules and frontend components are organized for reusability and maintainability:
- **Backend**: Separate business logic into services and use middleware for common tasks.
- **Frontend**: Use reusable components and hooks to manage state and lifecycle.
2. Coding Style & Formatting:
- For the backend (JavaScript):
• Use ES6+ features (const/let, arrow functions) consistently.
• Follow Prettier and ESLint configurations (e.g., consistent 2-space indentation, semicolons, and single quotes).
• Maintain clear asynchronous patterns with helper wrappers (e.g., wrapAsync).
- **Example from auth.js**:
```javascript
router.post('/signin/local', wrapAsync(async (req, res) => {
  const payload = await AuthService.signin(req.body.email, req.body.password, req);
  res.status(200).send(payload);
}));
```
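The `wrapAsync` helper referenced above lives in `src/helpers/wrapAsync.js`; its body is not shown here, but a minimal sketch (an assumption, not necessarily the project's exact implementation) looks like:

```javascript
// Hypothetical minimal wrapAsync: forwards any rejection from an async
// route handler to Express's next(), so the error middleware can handle it.
const wrapAsync = (fn) => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);
```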
• Document API endpoints with inline Swagger comments to ensure API clarity and consistency.
- **Example**:
```javascript
/**
 * @swagger
 * /api/auth/signin:
 *   post:
 *     summary: Sign in a user
 *     responses:
 *       200:
 *         description: Successful login
 */
```
- For the frontend (TypeScript/React):
• Use functional components with strict typing and separation of concerns.
- **Example**:
```typescript
const Button: React.FC<{ onClick: () => void }> = ({ onClick }) => (
  <button onClick={onClick}>Click me</button>
);
```
• Follow naming conventions: PascalCase for components and types/interfaces, camelCase for variables, hooks, and function names.
- **Example**:
```typescript
const useCustomHook = () => {
  const [state, setState] = useState(false);
  return [state, setState];
};
```
• Utilize hooks (useEffect, useState) to manage state and lifecycle in a clear and concise manner.
- **Example**:
```typescript
useEffect(() => {
  console.log('Component mounted');
}, []);
```
3. Code Quality & Best Practices:
- Ensure code modularity by splitting complex logic into smaller, testable units.
- **Example**: In `auth.js`, routes are separated from business logic, which is handled in `AuthService`.
- Write self-documenting code and add comments where the logic is non-trivial.
- **Example**: Use descriptive function and variable names in `auth.js`, and add comments for complex asynchronous operations.
- Embrace declarative programming and adhere to SOLID principles.
- **Example**: In service functions, ensure each function has a single responsibility and dependencies are injected rather than hardcoded.
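As an illustration of that principle, a service can receive its data-access dependency from the caller instead of requiring it directly. The names below are hypothetical, not the actual `AuthService` API:

```javascript
// Hypothetical factory: the users store is injected, so tests can pass a stub.
const makeAuthService = ({ users }) => ({
  async signin(email, password) {
    const user = await users.findByEmail(email);
    // Plain-text comparison for illustration only; real code should use
    // a hashing library such as bcrypt.
    if (!user || user.password !== password) {
      throw new Error('Invalid email or password');
    }
    return { email: user.email };
  },
});
```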
4. Consistency & Tools Integration:
- Leverage existing tools like Prettier and ESLint to automatically enforce style and formatting rules.
- **Example**: Use `.prettierrc` and `.eslintrc.cjs` for configuration in your project.
- Use TypeScript in the frontend to ensure type safety and catch errors early.
- **Example**: Define interfaces and types in your React components to enforce strict typing.
- Maintain uniformity in API design and error handling strategies.
- **Example**: Consistently use Passport for authentication and a common error handling middleware in `auth.js`.
## Group 2 Naming Conventions
1. File Naming and Structure:
• Frontend:
- Page Files: Use lower-case filenames (e.g., index.tsx) as prescribed by Next.js conventions.
- **Example**: `pages/index.tsx`, `pages/about.tsx`
- Component Files: Use PascalCase for React component files (e.g., WebSiteHeader.tsx, NavBar.tsx).
- **Example**: `components/Header.tsx`, `components/Footer.tsx`
- Directories: Use clear, descriptive names (e.g., 'pages', 'components', 'WebPageComponents').
- **Example**: `src/pages`, `src/components`
• Backend:
- Use lower-case filenames for modules (e.g., index.js, auth.js, projects.js).
- **Example**: `routes/auth.js`, `services/user.js`
- When needed, use hyphenation for clarity, but maintain consistency.
- **Example**: `helpers/wrap-async.js`
2. Component and Module Naming:
• Frontend:
- React Components: Define components in PascalCase.
- TypeScript Interfaces/Types: Use PascalCase (e.g., WebSiteHeaderProps).
• Backend:
- Classes (if any) and constructors should be in PascalCase; most helper functions and modules use camelCase.
3. Variable, Function, and Hook Naming:
• Use camelCase for variables and function names in both frontend and backend.
- **Example**:
```javascript
const userName = 'John Doe';
function handleLogin() { ... }
```
• Custom Hooks: Prefix with 'use' (e.g., useAuth, useForm).
- **Example**:
```typescript
const useAuth = () => {
  const [isAuthenticated, setIsAuthenticated] = useState(false);
  return { isAuthenticated, setIsAuthenticated };
};
```
4. Consistency and Readability:
• Maintain uniform naming across the project to ensure clarity and ease of maintenance.
- **Example**: Use consistent naming conventions for variables, functions, and components, such as camelCase for variables and functions, and PascalCase for components.
- **Example**: In `auth.js`, ensure that all function names clearly describe their purpose, such as `handleLogin` or `validateUserInput`.
## Group 3 Frontend & React Best Practices
1. Use of Functional Components & TypeScript:
• Build all components as functional components.
- **Example**:
```typescript
const Header: React.FC = () => {
  return <header>Header Content</header>;
};
```
• Leverage TypeScript for static type checking and enforce strict prop and state types.
- **Example**:
```typescript
interface ButtonProps {
  onClick: () => void;
}
const Button: React.FC<ButtonProps> = ({ onClick }) => (
  <button onClick={onClick}>Click me</button>
);
```
2. Effective Use of React Hooks:
• Utilize useState and useEffect appropriately with proper dependency arrays.
- **Example**:
```typescript
const [count, setCount] = useState(0);
useEffect(() => {
  console.log('Component mounted');
}, []);
```
• Create custom hooks to encapsulate shared logic (e.g., useAppSelector).
- **Example**:
```typescript
const useAuth = () => {
  const [isAuthenticated, setIsAuthenticated] = useState(false);
  return { isAuthenticated, setIsAuthenticated };
};
```
3. Component Composition & Separation of Concerns:
• Separate presentational (stateless) components from container components managing logic.
- **Example**: Use `LayoutGuest` to encapsulate common page structures.
4. Code Quality & Readability:
• Maintain consistent formatting and adhere to Prettier and ESLint rules.
• Use descriptive names for variables, functions, and components.
• Document non-trivial logic with inline comments and consider implementing error boundaries where needed.
• New code must adhere to these conventions to avoid ambiguity.
• Use descriptive names that reflect the purpose and domain, avoiding abbreviations unless standard in the project.
## Group 4 Backend & API Guidelines
1. API Endpoint Design & Documentation:
• Follow RESTful naming conventions; all route handlers should be named clearly and consistently.
- **Example**: Use HTTP methods (`GET`, `POST`, `PUT`, `DELETE`) to define actions, e.g., `GET /api/auth/me` to retrieve user info.
• Document endpoints with Swagger annotations to provide descriptions, expected request bodies, and response codes.
- **Example**:
```javascript
/**
 * @swagger
 * /api/auth/signin:
 *   post:
 *     summary: Sign in a user
 *     requestBody:
 *       description: User credentials
 *       content:
 *         application/json:
 *           schema:
 *             $ref: "#/components/schemas/Auth"
 *     responses:
 *       200:
 *         description: Successful login
 *       400:
 *         description: Invalid username/password supplied
 */
```
• Examples (for Auth endpoints):
- POST /api/auth/signin/local
• Description: Logs the user into the system.
• Request Body (application/json):
{ "email": "admin@flatlogic.com", "password": "password" }
• Responses:
- 200: Successful login (returns token and user data).
- 400: Invalid username/password supplied.
- GET /api/auth/me
• Description: Retrieves current authorized user information.
• Secured via Passport JWT; uses req.currentUser.
• Responses:
- 200: Returns current user info.
- 400: Invalid credentials or missing user data.
- POST /api/auth/signup
• Description: Registers a new user.
• Request Body (application/json):
{ "email": "admin@flatlogic.com", "password": "password" }
• Responses:
- 200: New user signed up successfully.
- 400: Invalid input supplied.
- 500: Server error.
## Group 5 Testing, Quality Assurance & Error Handling
1. Testing Guidelines:
• Write unit tests for critical backend and frontend components using frameworks such as Jest, React Testing Library, and Mocha/Chai.
- **Example**:
```javascript
test('should return user data', async () => {
  const user = await getUserData();
  expect(user).toHaveProperty('email');
});
```
• Practice test-driven development and maintain high test coverage.
• Regularly update tests following changes in business logic.
2. Quality Assurance:
• Enforce code quality with ESLint, Prettier, and static analysis tools.
• Integrate continuous testing workflows (CI/CD) to catch issues early.
- **Example**: Use GitHub Actions for automated testing and deployment.
• Ensure documentation is kept up-to-date with the implemented code.
3. Error Handling:
• Back-end:
- Wrap asynchronous route handlers with a helper (e.g., wrapAsync) to capture errors.
- **Example**:
```javascript
router.post('/signin', wrapAsync(async (req, res) => {
  const user = await AuthService.signin(req.body);
  res.send(user);
}));
```
- Use centralized error handling middleware (e.g., commonErrorHandler) for uniform error responses.
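`commonErrorHandler` itself is not shown in this file; a minimal sketch of such middleware (names and response shape are assumptions, not the project's actual code) could be:

```javascript
// Express identifies error middleware by its four-argument signature;
// `next` must stay in the parameter list even though it is unused here.
const commonErrorHandler = (err, req, res, next) => {
  const status = err.status || 500;
  res.status(status).send({ error: err.message || 'Internal server error' });
};
```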
• Front-end:
- Implement error boundaries in React to gracefully handle runtime errors.
- Display user-friendly error messages and log errors for further analysis.
4. Authentication & Security:
• Protect endpoints by using Passport.js with JWT (e.g., passport.authenticate('jwt', { session: false })).
- **Example**:
```javascript
router.get('/profile', passport.authenticate('jwt', { session: false }), (req, res) => {
  res.send(req.user);
});
```
• Ensure that secure routes check for the existence of req.currentUser. If it is absent, return a ForbiddenError.
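A sketch of that guard (the `ForbiddenError` class shown here is an assumption; the project's actual error types may differ):

```javascript
// Hypothetical ForbiddenError and a middleware guarding secured routes.
class ForbiddenError extends Error {
  constructor(message = 'Forbidden') {
    super(message);
    this.status = 403;
  }
}

const requireCurrentUser = (req, res, next) => {
  if (!req.currentUser) return next(new ForbiddenError());
  next();
};
```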
5. Consistent Error Handling & Middleware Usage:
• Wrap asynchronous route handlers with helpers like wrapAsync for error propagation.
• Use centralized error handling middleware (e.g., commonErrorHandler) to capture and format errors uniformly.
6. Modular Code Organization:
• Organize backend code into separate files for routes, services, and database access (e.g., auth.js, projects.js, tasks.js).
• Use descriptive, lowercase filenames for modules and routes.
7. Endpoint Security Best Practices:
• Validate input data and sanitize requests where necessary.
• Restrict sensitive operations to authenticated users with proper role-based permissions.
## Group 6 Accessibility, UI, and Styling Guidelines (Updated)
1. Sidebar Styling:
• The sidebar is implemented in the authenticated layout via the AsideMenu component, with the actual element defined in AsideMenuLayer (located at frontend/src/components/AsideMenuLayer.tsx) as an <aside> element with id="asideMenu".
- **Example**:
```css
#asideMenu {
  background-color: #F8F4E1 !important;
}
```
• When modifying sidebar styles, target #asideMenu and its child elements rather than generic selectors (e.g., avoid .app-sidebar) to ensure that the changes affect the actual rendered sidebar.
• Remove or override any conflicting background utilities (such as an unwanted bg-white) so our desired background color (#F8F4E1) is fully visible. Use a highly specific selector if necessary.
• Adjust spacing (padding/margins) at both the container (#asideMenu) and the individual menu item level to maintain a consistent, compact design.
2. General Project Styling and Tailwind CSS Usage:
• The application leverages Tailwind CSS extensively, with core styling defined in _theme.css using the @apply directive. Any new modifications should follow this pattern to ensure consistency.
- **Example**:
```css
.btn {
  @apply bg-blue-500 text-white;
}
```
• The themed blocks (like .theme-pink and .theme-green) standardize the UI's appearance. When applying custom overrides, ensure they integrate cleanly into these structures and avoid conflicts or circular dependency errors (e.g., issues when redefining utilities such as text-blue-600).
• Adjustments via Tailwind CSS generally require modifying class names in the components and ensuring that global overrides are applied in the correct order. Consistent use of design tokens and custom color codes (e.g., #F8F4E1) throughout the app is crucial to a cohesive design.
• Specificity is key. If a change isn't visually reflected as expected, inspect the rendered HTML to identify which classes are taking precedence.

3
.dockerignore Normal file

@ -0,0 +1,3 @@
backend/node_modules
frontend/node_modules
frontend/build

3
.gitignore vendored Normal file

@ -0,0 +1,3 @@
node_modules/
*/node_modules/
*/build/

187
502.html Normal file

@ -0,0 +1,187 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Service Starting</title>
<style>
body {
font-family: sans-serif;
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
min-height: 100vh;
background-color: #EFF2FF;
margin: 0;
padding: 20px;
}
.container {
text-align: center;
padding: 30px 40px;
background-color: #fff;
border-radius: 20px;
margin-bottom: 20px;
max-width: 538px;
width: 100%;
box-shadow: 0 13px 34px 0 rgba(167, 187, 242, 0.2);
box-sizing: border-box;
}
#status-heading {
font-size: 24px;
font-weight: 700;
color: #02004E;
margin-bottom: 20px;
}
h2 {
color: #333;
margin-bottom: 15px;
}
p {
color: #666;
font-size: 1.1em;
margin-bottom: 10px;
}
.tip {
font-weight: 300;
font-size: 17px;
line-height: 150%;
letter-spacing: 0;
text-align: center;
margin-top: 30px;
}
.loader-container {
position: relative;
display: flex;
justify-content: center;
align-items: center;
}
.loader {
width: 100px;
aspect-ratio: 1;
border-radius: 50%;
background:
radial-gradient(farthest-side, #5C7EF1 94%, #0000) top/8px 8px no-repeat,
conic-gradient(#0000 30%, #5C7EF1);
-webkit-mask: radial-gradient(farthest-side, #0000 calc(100% - 8px), #000 0);
animation: l13 2s infinite linear;
}
@keyframes l13 {
100% {
transform: rotate(1turn)
}
}
.app-logo {
position: absolute;
width: 36px;
}
.panel {
padding: 0 18px;
display: none;
background-color: white;
overflow: hidden;
margin-top: 10px;
}
.show {
display: block;
}
.project-info {
border: 1px solid #8C9DFF;
border-radius: 10px;
padding: 12px 16px;
max-width: 600px;
margin: 40px auto;
background-color: #FBFCFF;
}
.project-info h2 {
color: #02004E;
font-size: 14px;
font-weight: 500;
margin-bottom: 10px;
text-align: left;
}
.project-info p {
color: #686791;
font-size: 12px;
font-weight: 400;
text-align: left;
}
</style>
</head>
<body>
<div class="container">
<h2 id="status-heading">Loading the app, just a moment…</h2>
<p class="tip">The application is currently launching. The page will automatically refresh once the site is
available.</p>
<div class="project-info">
<h2>Gold Forecasting Engine</h2>
<p>Institutional-grade gold forecasting platform with multi-horizon predictions, scenario analysis, and explainable drivers.</p>
</div>
<div class="loader-container">
<img src="https://flatlogic.com/blog/wp-content/uploads/2025/05/logo-bot-1.png" alt="App Logo"
class="app-logo">
<div class="loader"></div>
</div>
<div class="panel">
<video width="100%" height="315" controls loop>
<source
src="https://flatlogic.com/blog/wp-content/uploads/2025/04/20250430_1336_professional_dynamo_spinner_simple_compose_01jt349yvtenxt7xhg8hhr85j8.mp4"
type="video/mp4">
Your browser does not support the video tag.
</video>
</div>
</div>
<script>
function checkAvailability() {
fetch('/')
.then(response => {
if (response.ok) {
window.location.reload();
} else {
setTimeout(checkAvailability, 5000);
}
})
.catch(() => {
setTimeout(checkAvailability, 5000);
});
}
document.addEventListener('DOMContentLoaded', checkAvailability);
document.addEventListener('DOMContentLoaded', function () {
const appTitle = document.querySelector('#status-heading');
const panel = document.querySelector('.panel');
const video = panel.querySelector('video');
let clickCount = 0;
appTitle.addEventListener('click', function () {
clickCount++;
if (clickCount === 5) {
panel.classList.toggle('show');
if (panel.classList.contains('show')) {
video.play();
} else {
video.pause();
}
clickCount = 0;
}
});
});
</script>
</body>
</html>

21
Dockerfile Normal file

@ -0,0 +1,21 @@
FROM node:20.15.1-alpine AS builder
RUN apk add --no-cache git
WORKDIR /app
COPY frontend/package.json frontend/yarn.lock ./
RUN yarn install --pure-lockfile
COPY frontend .
RUN yarn build
FROM node:20.15.1-alpine
WORKDIR /app
COPY backend/package.json backend/yarn.lock ./
RUN yarn install --pure-lockfile
COPY backend .
COPY --from=builder /app/build /app/public
CMD ["yarn", "start"]

85
Dockerfile.dev Normal file

@ -0,0 +1,85 @@
# Base image for Node.js dependencies
FROM node:20.15.1-alpine AS frontend-deps
RUN apk add --no-cache git
WORKDIR /app/frontend
COPY frontend/package.json frontend/yarn.lock ./
RUN yarn install --pure-lockfile
FROM node:20.15.1-alpine AS backend-deps
RUN apk add --no-cache git
WORKDIR /app/backend
COPY backend/package.json backend/yarn.lock ./
RUN yarn install --pure-lockfile
FROM node:20.15.1-alpine AS app-shell-deps
RUN apk add --no-cache git
WORKDIR /app/app-shell
COPY app-shell/package.json app-shell/yarn.lock ./
RUN yarn install --pure-lockfile
# Nginx setup and application build
FROM node:20.15.1-alpine AS build
RUN apk add --no-cache git nginx curl
RUN apk add --no-cache lsof procps
RUN yarn global add concurrently
RUN apk add --no-cache \
chromium \
nss \
freetype \
harfbuzz \
ttf-freefont \
fontconfig
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium-browser
RUN mkdir -p /app/pids
# Make sure to add yarn global bin to PATH
ENV PATH /root/.yarn/bin:/root/.config/yarn/global/node_modules/.bin:$PATH
# Copy dependencies
WORKDIR /app
COPY --from=frontend-deps /app/frontend /app/frontend
COPY --from=backend-deps /app/backend /app/backend
COPY --from=app-shell-deps /app/app-shell /app/app-shell
COPY frontend /app/frontend
COPY backend /app/backend
COPY app-shell /app/app-shell
COPY docker /app/docker
# Copy all files from root to /app
COPY . /app
# Copy Nginx configuration
COPY nginx.conf /etc/nginx/nginx.conf
# Copy custom error page
COPY 502.html /usr/share/nginx/html/502.html
# Change owner and permissions of the error page
RUN chown nginx:nginx /usr/share/nginx/html/502.html && \
chmod 644 /usr/share/nginx/html/502.html
# Expose the port the app runs on
EXPOSE 8080
ENV NODE_ENV=dev_stage
ENV FRONT_PORT=3001
ENV BACKEND_PORT=3000
ENV APP_SHELL_PORT=4000
CMD ["sh", "-c", "\
yarn --cwd /app/frontend dev & echo $! > /app/pids/frontend.pid && \
yarn --cwd /app/backend start & echo $! > /app/pids/backend.pid && \
sleep 10 && nginx -g 'daemon off;' & \
NGINX_PID=$! && \
echo 'Waiting for backend (port 3000) to be available...' && \
while ! nc -z localhost ${BACKEND_PORT}; do \
sleep 2; \
done && \
echo 'Backend is up. Starting app_shell for Git check...' && \
yarn --cwd /app/app-shell start && \
wait $NGINX_PID"]

1
LICENSE Normal file

@ -0,0 +1 @@
https://flatlogic.com/

244
README.md Normal file

@ -0,0 +1,244 @@
# Gold Forecasting Engine
## This project was generated by [Flatlogic Platform](https://flatlogic.com).
- Frontend: [React.js](https://flatlogic.com/templates?framework%5B%5D=react&sort=default)
- Backend: [NodeJS](https://flatlogic.com/templates?backend%5B%5D=nodejs&sort=default)
<details><summary>Backend Folder Structure</summary>
The generated application has the following backend folder structure:
The `src` folder contains your working files, which will later be used to create the build. It contains the following folders:
- `auth` - configuration of the library used for authentication and authorization;
- `db` - contains the following folders:
  - `api` - documentation that is automatically generated by jsdoc or similar tools;
  - `migrations` - the skeleton of the database, i.e. all the actions users perform on the database;
  - `models` - what represents the database for the backend;
  - `seeders` - the entities that create the initial data for the database.
- `routes` - contains all the routes created with Express Router; what they do is exported from a controller file;
- `services` - contains the `emails` and `notifications` folders.
</details>
- Database: PostgreSQL
- app-shell: Core application framework that provides essential infrastructure services for the entire application.
-----------------------
### There are two ways to start the project locally: by running the Frontend and Backend directly, or with Docker.
-----------------------
## To start the project:
### Backend:
> Please change current folder: `cd backend`
#### Install local dependencies:
`yarn install`
------------
#### Adjust local db:
##### 1. Install postgres:
MacOS:
`brew install postgres`
> If you don't have brew, install it (https://brew.sh) and then repeat `brew install postgres`.
Ubuntu:
`sudo apt update`
`sudo apt install postgresql postgresql-contrib`
##### 2. Create db and admin user:
Before running and testing the connection, make sure you have created a database as described in the configuration above. You can use the `psql` command to create a user and database.
`psql postgres -U postgres`
Next, run these commands to create a new user with a password and grant it permission to create databases.
`postgres-# CREATE ROLE admin WITH LOGIN PASSWORD 'admin_pass';`
`postgres-# ALTER ROLE admin CREATEDB;`
Quit `psql`, then log in again as the newly created user.
`postgres-# \q`
`psql postgres -U admin`
Run this command to create a new database.
`postgres=> CREATE DATABASE db_{your_project_name};`
Then grant the new user privileges on the new database and quit `psql`.
`postgres=> GRANT ALL PRIVILEGES ON DATABASE db_{your_project_name} TO admin;`
`postgres=> \q`
------------
#### Create database:
`yarn db:create`
#### Start production build:
`yarn start`
### Frontend:
> Please change current folder: `cd frontend`
## To start the project with Docker:
### Description:
The project contains the **docker folder** and the `Dockerfile`.
The `Dockerfile` is used to deploy the project to Google Cloud.
The **docker folder** contains a couple of helper scripts:
- `docker-compose.yml` (all our services: web, backend, db are described here)
- `start-backend.sh` (starts backend, but only after the database)
- `wait-for-it.sh` (imported from https://github.com/vishnubob/wait-for-it)
> To avoid breaking the application, we recommend that you don't edit anything inside the **docker folder** or the `Dockerfile`.
## Run services:
1. Install docker compose (https://docs.docker.com/compose/install/)
2. Move to `docker` folder. All next steps should be done from this folder.
``` cd docker ```
3. Make executables from `wait-for-it.sh` and `start-backend.sh`:
``` chmod +x start-backend.sh && chmod +x wait-for-it.sh ```
4. Download the dependent projects for the services.
5. Review the `docker-compose.yml` file and make sure that all services have Dockerfiles; only the db service doesn't require one.
6. Make sure the required ports (listed under `ports`) are available on your local machine.
7. Start services:
7.1. With an empty database: `rm -rf data && docker-compose up`
7.2. With database data stored from previous runs: `docker-compose up`
8. Check http://localhost:3000
9. Stop services:
9.1. Just press `Ctrl+C`
## Most common errors:
1. `connection refused`
There could be many reasons, but the most common are:
- The port is not open on the destination machine.
- The port is open on the destination machine, but its backlog of pending connections is full.
- A firewall between the client and server is blocking access (also check local firewalls).
After checking for firewalls and that the port is open, use telnet to connect to the IP/port to test connectivity. This removes any potential issues from your application.
***MacOS:***
If you suspect that your SSH service might be down, you can run this command to find out:
`sudo service ssh status`
If the command line returns a status of down, then you've likely found the reason behind your connectivity error.
***Ubuntu:***
Sometimes a connection refused error can also indicate that there is an IP address conflict on your network. You can search for possible IP conflicts by running:
`arp-scan -I eth0 -l | grep <ipaddress>`
and
`arping <ipaddress>`
2. `yarn db:create` creates the database with already-assembled tables (on macOS with a Postgres database)
The workaround - put the next commands to your Postgres database terminal:
`DROP SCHEMA public CASCADE;`
`CREATE SCHEMA public;`
`GRANT ALL ON SCHEMA public TO postgres;`
`GRANT ALL ON SCHEMA public TO public;`
Afterwards, continue to start your project in the backend directory by running:
`yarn start`

14
backend/.env Normal file

@ -0,0 +1,14 @@
DB_NAME=app_39289
DB_USER=app_39289
DB_PASS=dde21f88-6a94-4690-9933-32a3b06c2977
DB_HOST=127.0.0.1
DB_PORT=5432
PORT=3000
GOOGLE_CLIENT_ID=671001533244-kf1k1gmp6mnl0r030qmvdu6v36ghmim6.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=Yo4qbKZniqvojzUQ60iKlxqR
MS_CLIENT_ID=4696f457-31af-40de-897c-e00d7d4cff73
MS_CLIENT_SECRET=m8jzZ.5UpHF3=-dXzyxiZ4e[F8OF54@p
EMAIL_USER=AKIAVEW7G4PQUBGM52OF
EMAIL_PASS=BLnD4hKGb6YkSz3gaQrf8fnyLi3C3/EdjOOsLEDTDPTz
SECRET_KEY=HUEyqESqgQ1yTwzVlO6wprC9Kf1J1xuA
PEXELS_KEY=Vc99rnmOhHhJAbgGQoKLZtsaIVfkeownoQNbTj78VemUjKh08ZYRbf18

4
backend/.eslintignore Normal file

@ -0,0 +1,4 @@
# Ignore generated and runtime files
node_modules/
tmp/
logs/

15
backend/.eslintrc.cjs Normal file

@ -0,0 +1,15 @@
module.exports = {
  env: {
    node: true,
    es2021: true
  },
  extends: [
    'eslint:recommended'
  ],
  plugins: [
    'import'
  ],
  rules: {
    'import/no-unresolved': 'error'
  }
};

11
backend/.prettierrc Normal file

@ -0,0 +1,11 @@
{
  "singleQuote": true,
  "tabWidth": 2,
  "printWidth": 80,
  "trailingComma": "all",
  "quoteProps": "as-needed",
  "jsxSingleQuote": true,
  "bracketSpacing": true,
  "bracketSameLine": false,
  "arrowParens": "always"
}

7
backend/.sequelizerc Normal file

@ -0,0 +1,7 @@
const path = require('path');

module.exports = {
  "config": path.resolve("src", "db", "db.config.js"),
  "models-path": path.resolve("src", "db", "models"),
  "seeders-path": path.resolve("src", "db", "seeders"),
  "migrations-path": path.resolve("src", "db", "migrations")
};

23
backend/Dockerfile Normal file

@ -0,0 +1,23 @@
FROM node:20.15.1-alpine
RUN apk update && apk add bash
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN yarn install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "yarn", "start" ]

56
backend/README.md Normal file

@ -0,0 +1,56 @@
# Gold Forecasting Engine - template backend
#### Run App on local machine:
##### Install local dependencies:
- `yarn install`
------------
##### Adjust local db:
###### 1. Install postgres:
- MacOS:
- `brew install postgres`
- Ubuntu:
- `sudo apt update`
- `sudo apt install postgresql postgresql-contrib`
###### 2. Create db and admin user:
- Before running and testing the connection, make sure you have created a database as described in the configuration above. You can use the `psql` command to create a user and database.
- `psql postgres -U postgres`
- Next, run these commands to create a new user with a password and grant it permission to create databases.
- `postgres-# CREATE ROLE admin WITH LOGIN PASSWORD 'admin_pass';`
- `postgres-# ALTER ROLE admin CREATEDB;`
- Quit `psql`, then log in again as the newly created user.
- `postgres-# \q`
- `psql postgres -U admin`
- Run this command to create a new database.
- `postgres=> CREATE DATABASE db_gold_forecasting_engine;`
- Then grant the new user privileges on the new database and quit `psql`.
- `postgres=> GRANT ALL PRIVILEGES ON DATABASE db_gold_forecasting_engine TO admin;`
- `postgres=> \q`
------------
#### Api Documentation (Swagger)
http://localhost:8080/api-docs (local host)
http://host_name/api-docs
------------
##### Set up database tables (or update them after a schema change)
- `yarn db:migrate`
##### Seed the initial data (admin accounts, relevant for the first setup):
- `yarn db:seed`
##### Start build:
- `yarn start`

56
backend/package.json Normal file

@ -0,0 +1,56 @@
{
  "name": "goldforecastingengine",
  "description": "Gold Forecasting Engine - template backend",
  "scripts": {
    "start": "npm run db:migrate && npm run db:seed && npm run watch",
    "lint": "eslint . --ext .js",
    "db:migrate": "sequelize-cli db:migrate",
    "db:seed": "sequelize-cli db:seed:all",
    "db:drop": "sequelize-cli db:drop",
    "db:create": "sequelize-cli db:create",
    "watch": "node watcher.js"
  },
  "dependencies": {
    "@google-cloud/storage": "^5.18.2",
    "axios": "^1.6.7",
    "bcrypt": "5.1.1",
    "chokidar": "^4.0.3",
    "cors": "2.8.5",
    "csv-parser": "^3.0.0",
    "express": "4.18.2",
    "formidable": "1.2.2",
    "helmet": "4.1.1",
    "json2csv": "^5.0.7",
    "jsonwebtoken": "8.5.1",
    "lodash": "4.17.21",
    "moment": "2.30.1",
    "multer": "^1.4.4",
    "mysql2": "2.2.5",
    "nodemailer": "6.9.9",
    "passport": "^0.7.0",
    "passport-google-oauth2": "^0.2.0",
    "passport-jwt": "^4.0.1",
    "passport-microsoft": "^0.1.0",
    "pg": "8.4.1",
    "pg-hstore": "2.3.4",
    "sequelize": "6.35.2",
    "sequelize-json-schema": "^2.1.1",
    "sqlite": "4.0.15",
    "swagger-jsdoc": "^6.2.8",
    "swagger-ui-express": "^5.0.0",
    "tedious": "^18.2.4"
  },
  "engines": {
    "node": ">=18"
  },
  "private": true,
  "devDependencies": {
    "cross-env": "7.0.3",
    "eslint": "^8.23.1",
    "eslint-plugin-import": "^2.29.1",
    "mocha": "8.1.3",
    "node-mocks-http": "1.9.0",
    "nodemon": "2.0.5",
    "sequelize-cli": "6.6.2"
  }
}


@ -0,0 +1,484 @@
"use strict";
const fs = require("fs");
const path = require("path");
const http = require("http");
const https = require("https");
const { URL } = require("url");
let CONFIG_CACHE = null;
class LocalAIApi {
  static createResponse(params, options) {
    return createResponse(params, options);
  }
  static request(pathValue, payload, options) {
    return request(pathValue, payload, options);
  }
  static fetchStatus(aiRequestId, options) {
    return fetchStatus(aiRequestId, options);
  }
  static awaitResponse(aiRequestId, options) {
    return awaitResponse(aiRequestId, options);
  }
  static extractText(response) {
    return extractText(response);
  }
  static decodeJsonFromResponse(response) {
    return decodeJsonFromResponse(response);
  }
}
async function createResponse(params, options = {}) {
  const payload = { ...(params || {}) };
  if (!Array.isArray(payload.input) || payload.input.length === 0) {
    return {
      success: false,
      error: "input_missing",
      message: 'Parameter "input" is required and must be a non-empty array.',
    };
  }
  const cfg = config();
  if (!payload.model) {
    payload.model = cfg.defaultModel;
  }
  const initial = await request(options.path, payload, options);
  if (!initial.success) {
    return initial;
  }
  const data = initial.data;
  if (data && typeof data === "object" && data.ai_request_id) {
    const pollTimeout = Number(options.poll_timeout ?? 300);
    const pollInterval = Number(options.poll_interval ?? 5);
    return await awaitResponse(data.ai_request_id, {
      interval: pollInterval,
      timeout: pollTimeout,
      headers: options.headers,
      timeout_per_call: options.timeout,
      verify_tls: options.verify_tls,
    });
  }
  return initial;
}
async function request(pathValue, payload = {}, options = {}) {
const cfg = config();
const resolvedPath = pathValue || options.path || cfg.responsesPath;
if (!resolvedPath) {
return {
success: false,
error: "project_id_missing",
message: "PROJECT_ID is not defined; cannot resolve AI proxy endpoint.",
};
}
if (!cfg.projectUuid) {
return {
success: false,
error: "project_uuid_missing",
message: "PROJECT_UUID is not defined; aborting AI request.",
};
}
const bodyPayload = { ...(payload || {}) };
if (!bodyPayload.project_uuid) {
bodyPayload.project_uuid = cfg.projectUuid;
}
const url = buildUrl(resolvedPath, cfg.baseUrl);
const timeout = resolveTimeout(options.timeout, cfg.timeout);
const verifyTls = resolveVerifyTls(options.verify_tls, cfg.verifyTls);
const headers = {
Accept: "application/json",
"Content-Type": "application/json",
[cfg.projectHeader]: cfg.projectUuid,
};
if (Array.isArray(options.headers)) {
for (const header of options.headers) {
if (typeof header === "string" && header.includes(":")) {
// split on the first ":" only, so header values containing colons survive intact
const idx = header.indexOf(":");
headers[header.slice(0, idx).trim()] = header.slice(idx + 1).trim();
}
}
}
const body = JSON.stringify(bodyPayload);
return sendRequest(url, "POST", body, headers, timeout, verifyTls);
}
async function fetchStatus(aiRequestId, options = {}) {
const cfg = config();
if (!cfg.projectUuid) {
return {
success: false,
error: "project_uuid_missing",
message: "PROJECT_UUID is not defined; aborting status check.",
};
}
const statusPath = resolveStatusPath(aiRequestId, cfg);
const url = buildUrl(statusPath, cfg.baseUrl);
const timeout = resolveTimeout(options.timeout, cfg.timeout);
const verifyTls = resolveVerifyTls(options.verify_tls, cfg.verifyTls);
const headers = {
Accept: "application/json",
[cfg.projectHeader]: cfg.projectUuid,
};
if (Array.isArray(options.headers)) {
for (const header of options.headers) {
if (typeof header === "string" && header.includes(":")) {
const idx = header.indexOf(":");
headers[header.slice(0, idx).trim()] = header.slice(idx + 1).trim();
}
}
}
return sendRequest(url, "GET", null, headers, timeout, verifyTls);
}
async function awaitResponse(aiRequestId, options = {}) {
const timeout = Number(options.timeout ?? 300);
const interval = Math.max(Number(options.interval ?? 5), 1);
const deadline = Date.now() + Math.max(timeout, interval) * 1000;
while (true) {
const statusResp = await fetchStatus(aiRequestId, {
headers: options.headers,
timeout: options.timeout_per_call,
verify_tls: options.verify_tls,
});
if (statusResp.success) {
const data = statusResp.data || {};
if (data && typeof data === "object") {
if (data.status === "success") {
return {
success: true,
status: 200,
data: data.response || data,
};
}
if (data.status === "failed") {
return {
success: false,
status: 500,
error: String(data.error || "AI request failed"),
data,
};
}
}
} else {
return statusResp;
}
if (Date.now() >= deadline) {
return {
success: false,
error: "timeout",
message: "Timed out waiting for AI response.",
};
}
await sleep(interval * 1000);
}
}
function extractText(response) {
const payload = response && typeof response === "object" ? response.data || response : null;
if (!payload || typeof payload !== "object") {
return "";
}
if (Array.isArray(payload.output)) {
let combined = "";
for (const item of payload.output) {
if (!item || !Array.isArray(item.content)) {
continue;
}
for (const block of item.content) {
if (
block &&
typeof block === "object" &&
block.type === "output_text" &&
typeof block.text === "string" &&
block.text.length > 0
) {
combined += block.text;
}
}
}
if (combined) {
return combined;
}
}
if (
payload.choices &&
payload.choices[0] &&
payload.choices[0].message &&
typeof payload.choices[0].message.content === "string"
) {
return payload.choices[0].message.content;
}
return "";
}
function decodeJsonFromResponse(response) {
const text = extractText(response);
if (!text) {
throw new Error("No text found in AI response.");
}
const parsed = parseJson(text);
if (parsed.ok && parsed.value && typeof parsed.value === "object") {
return parsed.value;
}
const stripped = stripJsonFence(text);
if (stripped !== text) {
const parsedStripped = parseJson(stripped);
if (parsedStripped.ok && parsedStripped.value && typeof parsedStripped.value === "object") {
return parsedStripped.value;
}
throw new Error(`JSON parse failed after stripping fences: ${parsedStripped.error}`);
}
throw new Error(`JSON parse failed: ${parsed.error}`);
}
function config() {
if (CONFIG_CACHE) {
return CONFIG_CACHE;
}
ensureEnvLoaded();
const baseUrl = process.env.AI_PROXY_BASE_URL || "https://flatlogic.com";
const projectId = process.env.PROJECT_ID || null;
let responsesPath = process.env.AI_RESPONSES_PATH || null;
if (!responsesPath && projectId) {
responsesPath = `/projects/${projectId}/ai-request`;
}
const timeout = resolveTimeout(process.env.AI_TIMEOUT, 30);
const verifyTls = resolveVerifyTls(process.env.AI_VERIFY_TLS, true);
CONFIG_CACHE = {
baseUrl,
responsesPath,
projectId,
projectUuid: process.env.PROJECT_UUID || null,
projectHeader: process.env.AI_PROJECT_HEADER || "project-uuid",
defaultModel: process.env.AI_DEFAULT_MODEL || "gpt-5-mini",
timeout,
verifyTls,
};
return CONFIG_CACHE;
}
function buildUrl(pathValue, baseUrl) {
const trimmed = String(pathValue || "").trim();
if (trimmed === "") {
return baseUrl;
}
if (trimmed.startsWith("http://") || trimmed.startsWith("https://")) {
return trimmed;
}
if (trimmed.startsWith("/")) {
return `${baseUrl}${trimmed}`;
}
return `${baseUrl}/${trimmed}`;
}
function resolveStatusPath(aiRequestId, cfg) {
const basePath = (cfg.responsesPath || "").replace(/\/+$/, "");
if (!basePath) {
return `/ai-request/${encodeURIComponent(String(aiRequestId))}/status`;
}
const normalized = basePath.endsWith("/ai-request") ? basePath : `${basePath}/ai-request`;
return `${normalized}/${encodeURIComponent(String(aiRequestId))}/status`;
}
function sendRequest(urlString, method, body, headers, timeoutSeconds, verifyTls) {
return new Promise((resolve) => {
let targetUrl;
try {
targetUrl = new URL(urlString);
} catch (err) {
resolve({
success: false,
error: "invalid_url",
message: err.message,
});
return;
}
const isHttps = targetUrl.protocol === "https:";
const requestFn = isHttps ? https.request : http.request;
const options = {
protocol: targetUrl.protocol,
hostname: targetUrl.hostname,
port: targetUrl.port || (isHttps ? 443 : 80),
path: `${targetUrl.pathname}${targetUrl.search}`,
method: method.toUpperCase(),
headers,
timeout: Math.max(Number(timeoutSeconds || 30), 1) * 1000,
};
if (isHttps) {
options.rejectUnauthorized = Boolean(verifyTls);
}
const req = requestFn(options, (res) => {
let responseBody = "";
res.setEncoding("utf8");
res.on("data", (chunk) => {
responseBody += chunk;
});
res.on("end", () => {
const status = res.statusCode || 0;
const parsed = parseJson(responseBody);
const payload = parsed.ok ? parsed.value : responseBody;
if (status >= 200 && status < 300) {
const result = {
success: true,
status,
data: payload,
};
if (!parsed.ok) {
result.json_error = parsed.error;
}
resolve(result);
return;
}
const errorMessage =
parsed.ok && payload && typeof payload === "object"
? String(payload.error || payload.message || "AI proxy request failed")
: String(responseBody || "AI proxy request failed");
resolve({
success: false,
status,
error: errorMessage,
response: payload,
json_error: parsed.ok ? undefined : parsed.error,
});
});
});
req.on("timeout", () => {
req.destroy(new Error("request_timeout"));
});
req.on("error", (err) => {
resolve({
success: false,
error: "request_failed",
message: err.message,
});
});
if (body) {
req.write(body);
}
req.end();
});
}
function parseJson(value) {
if (typeof value !== "string" || value.trim() === "") {
return { ok: false, error: "empty_response" };
}
try {
return { ok: true, value: JSON.parse(value) };
} catch (err) {
return { ok: false, error: err.message };
}
}
function stripJsonFence(text) {
const trimmed = text.trim();
if (trimmed.startsWith("```json")) {
return trimmed.replace(/^```json/, "").replace(/```$/, "").trim();
}
if (trimmed.startsWith("```")) {
return trimmed.replace(/^```/, "").replace(/```$/, "").trim();
}
return text;
}
function resolveTimeout(value, fallback) {
const parsed = Number.parseInt(String(value ?? fallback), 10);
return Number.isNaN(parsed) ? Number(fallback) : parsed;
}
function resolveVerifyTls(value, fallback) {
if (value === undefined || value === null) {
return Boolean(fallback);
}
return String(value).toLowerCase() !== "false" && String(value) !== "0";
}
function ensureEnvLoaded() {
if (process.env.PROJECT_UUID && process.env.PROJECT_ID) {
return;
}
const envPath = path.resolve(__dirname, "../../../../.env");
if (!fs.existsSync(envPath)) {
return;
}
let content;
try {
content = fs.readFileSync(envPath, "utf8");
} catch (err) {
throw new Error(`Failed to read executor .env: ${err.message}`);
}
for (const line of content.split(/\r?\n/)) {
const trimmed = line.trim();
if (!trimmed || trimmed.startsWith("#") || !trimmed.includes("=")) {
continue;
}
const [rawKey, ...rest] = trimmed.split("=");
const key = rawKey.trim();
if (!key) {
continue;
}
const value = rest.join("=").trim().replace(/^['"]|['"]$/g, "");
if (!process.env[key]) {
process.env[key] = value;
}
}
}
function sleep(ms) {
return new Promise((resolve) => setTimeout(resolve, ms));
}
module.exports = {
LocalAIApi,
createResponse,
request,
fetchStatus,
awaitResponse,
extractText,
decodeJsonFromResponse,
};
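A quick way to see why `decodeJsonFromResponse` needs the fence-stripping fallback: models frequently wrap JSON replies in Markdown code fences, which `JSON.parse` rejects. A standalone sketch of that recovery path (backticks are written as `\x60` escapes only so this example can sit inside documentation):

```javascript
// Standalone sketch of the fence-stripping fallback used by decodeJsonFromResponse.
const FENCE = "\x60\x60\x60"; // three backticks

function stripJsonFence(text) {
  const trimmed = text.trim();
  if (trimmed.startsWith(FENCE + "json")) {
    return trimmed.slice((FENCE + "json").length).replace(new RegExp(FENCE + "$"), "").trim();
  }
  if (trimmed.startsWith(FENCE)) {
    return trimmed.slice(FENCE.length).replace(new RegExp(FENCE + "$"), "").trim();
  }
  return text;
}

// A fenced model reply that plain JSON.parse would reject:
const fenced = FENCE + "json\n{\"trend\": \"bullish\"}\n" + FENCE;
const parsed = JSON.parse(stripJsonFence(fenced));
console.log(parsed.trend); // bullish
```

Text without fences passes through unchanged, so the helper is safe to apply unconditionally before the second parse attempt.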

backend/src/auth/auth.js Normal file
const config = require('../config');
const providers = config.providers;
const helpers = require('../helpers');
const db = require('../db/models');
const passport = require('passport');
const JWTstrategy = require('passport-jwt').Strategy;
const ExtractJWT = require('passport-jwt').ExtractJwt;
const GoogleStrategy = require('passport-google-oauth2').Strategy;
const MicrosoftStrategy = require('passport-microsoft').Strategy;
const UsersDBApi = require('../db/api/users');
passport.use(new JWTstrategy({
passReqToCallback: true,
secretOrKey: config.secret_key,
jwtFromRequest: ExtractJWT.fromAuthHeaderAsBearerToken()
}, async (req, token, done) => {
try {
const user = await UsersDBApi.findBy({ email: token.user.email });
if (user && user.disabled) {
return done(new Error(`User '${user.email}' is disabled`));
}
req.currentUser = user;
return done(null, user);
} catch (error) {
done(error);
}
}));
passport.use(new GoogleStrategy({
clientID: config.google.clientId,
clientSecret: config.google.clientSecret,
callbackURL: config.apiUrl + '/auth/signin/google/callback',
passReqToCallback: true
},
function (request, accessToken, refreshToken, profile, done) {
socialStrategy(profile.email, profile, providers.GOOGLE, done);
}
));
passport.use(new MicrosoftStrategy({
clientID: config.microsoft.clientId,
clientSecret: config.microsoft.clientSecret,
callbackURL: config.apiUrl + '/auth/signin/microsoft/callback',
passReqToCallback: true
},
function (request, accessToken, refreshToken, profile, done) {
const email = profile._json.mail || profile._json.userPrincipalName;
socialStrategy(email, profile, providers.MICROSOFT, done);
}
));
function socialStrategy(email, profile, provider, done) {
db.users.findOrCreate({ where: { email, provider } })
.then(([user]) => {
const body = {
id: user.id,
email: user.email,
name: profile.displayName,
};
const token = helpers.jwtSign({ user: body });
return done(null, { token });
})
// propagate lookup/creation failures to Passport instead of leaving the promise unhandled
.catch((error) => done(error));
}

backend/src/config.js Normal file
const os = require('os');
const config = {
gcloud: {
bucket: "fldemo-files",
hash: "afeefb9d49f5b7977577876b99532ac7"
},
bcrypt: {
saltRounds: 12
},
admin_pass: "dde21f88",
user_pass: "32a3b06c2977",
admin_email: "admin@flatlogic.com",
providers: {
LOCAL: 'local',
GOOGLE: 'google',
MICROSOFT: 'microsoft'
},
secret_key: process.env.SECRET_KEY || 'dde21f88-6a94-4690-9933-32a3b06c2977',
remote: '',
port: process.env.NODE_ENV === "production" ? "" : "8080",
hostUI: process.env.NODE_ENV === "production" ? "" : "http://localhost",
portUI: process.env.NODE_ENV === "production" ? "" : "3000",
portUIProd: process.env.NODE_ENV === "production" ? "" : ":3000",
swaggerUI: process.env.NODE_ENV === "production" ? "" : "http://localhost",
swaggerPort: process.env.NODE_ENV === "production" ? "" : ":8080",
google: {
clientId: process.env.GOOGLE_CLIENT_ID || '',
clientSecret: process.env.GOOGLE_CLIENT_SECRET || '',
},
microsoft: {
clientId: process.env.MS_CLIENT_ID || '',
clientSecret: process.env.MS_CLIENT_SECRET || '',
},
uploadDir: os.tmpdir(),
email: {
from: 'Gold Forecasting Engine <app@flatlogic.app>',
host: 'email-smtp.us-east-1.amazonaws.com',
port: 587,
auth: {
user: process.env.EMAIL_USER || '',
pass: process.env.EMAIL_PASS,
},
tls: {
rejectUnauthorized: false
}
},
roles: {
admin: 'Administrator',
user: 'ReadOnlyAnalyst',
},
project_uuid: 'dde21f88-6a94-4690-9933-32a3b06c2977',
flHost: process.env.NODE_ENV === 'production' || process.env.NODE_ENV === 'dev_stage' ? 'https://flatlogic.com/projects' : 'http://localhost:3000/projects',
gpt_key: process.env.GPT_KEY || '',
};
config.pexelsKey = process.env.PEXELS_KEY || '';
config.pexelsQuery = 'Mountain sunrise over calm horizon';
config.host = process.env.NODE_ENV === "production" ? config.remote : "http://localhost";
config.apiUrl = `${config.host}${config.port ? `:${config.port}` : ``}/api`;
config.swaggerUrl = `${config.swaggerUI}${config.swaggerPort}`;
config.uiUrl = `${config.hostUI}${config.portUI ? `:${config.portUI}` : ``}/#`;
config.backUrl = `${config.hostUI}${config.portUI ? `:${config.portUI}` : ``}`;
module.exports = config;
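In development (`NODE_ENV` unset), the conditional port interpolation at the bottom resolves to localhost URLs; a self-contained sketch of that composition:

```javascript
// Development values: NODE_ENV !== "production", so host/port keep their
// localhost fallbacks and the template literals append ":<port>".
const host = "http://localhost";
const port = "8080";
const hostUI = "http://localhost";
const portUI = "3000";

const apiUrl = `${host}${port ? `:${port}` : ``}/api`;
const uiUrl = `${hostUI}${portUI ? `:${portUI}` : ``}/#`;

console.log(apiUrl); // http://localhost:8080/api
console.log(uiUrl);  // http://localhost:3000/#
```

In production the port strings are empty, so the ternaries skip the `:<port>` suffix entirely.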

const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Alert_eventsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const alert_events = await db.alert_events.create(
{
id: data.id || undefined,
triggered_at: data.triggered_at || null,
state: data.state || null,
message: data.message || null,
observed_value: data.observed_value || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await alert_events.setAlert(data.alert || null, { transaction });
await alert_events.setAcknowledged_by(data.acknowledged_by || null, { transaction });
return alert_events;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const alert_eventsData = data.map((item, index) => ({
id: item.id || undefined,
triggered_at: item.triggered_at || null,
state: item.state || null,
message: item.message || null,
observed_value: item.observed_value || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const alert_events = await db.alert_events.bulkCreate(alert_eventsData, { transaction });
// For each item created, replace relation files
return alert_events;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const alert_events = await db.alert_events.findByPk(id, { transaction });
const updatePayload = {};
if (data.triggered_at !== undefined) updatePayload.triggered_at = data.triggered_at;
if (data.state !== undefined) updatePayload.state = data.state;
if (data.message !== undefined) updatePayload.message = data.message;
if (data.observed_value !== undefined) updatePayload.observed_value = data.observed_value;
updatePayload.updatedById = currentUser.id;
await alert_events.update(updatePayload, {transaction});
if (data.alert !== undefined) {
await alert_events.setAlert(
data.alert,
{ transaction }
);
}
if (data.acknowledged_by !== undefined) {
await alert_events.setAcknowledged_by(
data.acknowledged_by,
{ transaction }
);
}
return alert_events;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const alert_events = await db.alert_events.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of alert_events) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of alert_events) {
await record.destroy({transaction});
}
});
return alert_events;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const alert_events = await db.alert_events.findByPk(id, options);
await alert_events.update({
deletedBy: currentUser.id
}, {
transaction,
});
await alert_events.destroy({
transaction
});
return alert_events;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const alert_events = await db.alert_events.findOne({ where, transaction });
if (!alert_events) {
return alert_events;
}
const output = alert_events.get({plain: true});
output.alert = await alert_events.getAlert({
transaction
});
output.acknowledged_by = await alert_events.getAcknowledged_by({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page || 0;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.alerts,
as: 'alert',
where: filter.alert ? {
[Op.or]: [
{ id: { [Op.in]: filter.alert.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.alert.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.users,
as: 'acknowledged_by',
where: filter.acknowledged_by ? {
[Op.or]: [
{ id: { [Op.in]: filter.acknowledged_by.split('|').map(term => Utils.uuid(term)) } },
{
firstName: {
[Op.or]: filter.acknowledged_by.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.message) {
where = {
...where,
[Op.and]: Utils.ilike(
'alert_events',
'message',
filter.message,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
triggered_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
};
}
if (filter.triggered_atRange) {
const [start, end] = filter.triggered_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
triggered_at: {
...where.triggered_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
triggered_at: {
...where.triggered_at,
[Op.lte]: end,
},
};
}
}
if (filter.observed_valueRange) {
const [start, end] = filter.observed_valueRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
observed_value: {
...where.observed_value,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
observed_value: {
...where.observed_value,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.state) {
where = {
...where,
state: filter.state,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.alert_events.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'alert_events',
'state',
query,
),
],
};
}
const records = await db.alert_events.findAll({
attributes: ['id', 'state'],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['state', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.state,
}));
}
};
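`findAll` computes its offset as `page * limit`; a hypothetical standalone helper makes the pagination arithmetic easy to check (guarding against `NaN` when `page` is absent):

```javascript
// Hypothetical helper mirroring findAll's pagination math:
// offset = page * limit; a missing page coerces to 0 instead of NaN.
function pageToOffset(filter) {
  const limit = filter.limit || 0;
  const currentPage = +filter.page || 0;
  return currentPage * limit;
}

console.log(pageToOffset({ limit: 25, page: 0 }));  // 0
console.log(pageToOffset({ limit: 25, page: 3 }));  // 75
console.log(pageToOffset({ limit: 25 }));           // 0 (page missing)
```

Page numbers are zero-based here, so page 0 returns the first `limit` rows.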

const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class AlertsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const alerts = await db.alerts.create(
{
id: data.id || undefined,
name: data.name || null,
alert_type: data.alert_type || null,
severity: data.severity || null,
delivery_channel: data.delivery_channel || null,
is_enabled: data.is_enabled || false,
threshold_value: data.threshold_value || null,
rule_description: data.rule_description || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await alerts.setTarget_asset(data.target_asset || null, { transaction });
await alerts.setTarget_model(data.target_model || null, { transaction });
await alerts.setOwner(data.owner || null, { transaction });
return alerts;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const alertsData = data.map((item, index) => ({
id: item.id || undefined,
name: item.name || null,
alert_type: item.alert_type || null,
severity: item.severity || null,
delivery_channel: item.delivery_channel || null,
is_enabled: item.is_enabled || false,
threshold_value: item.threshold_value || null,
rule_description: item.rule_description || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const alerts = await db.alerts.bulkCreate(alertsData, { transaction });
// For each item created, replace relation files
return alerts;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const alerts = await db.alerts.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
if (data.alert_type !== undefined) updatePayload.alert_type = data.alert_type;
if (data.severity !== undefined) updatePayload.severity = data.severity;
if (data.delivery_channel !== undefined) updatePayload.delivery_channel = data.delivery_channel;
if (data.is_enabled !== undefined) updatePayload.is_enabled = data.is_enabled;
if (data.threshold_value !== undefined) updatePayload.threshold_value = data.threshold_value;
if (data.rule_description !== undefined) updatePayload.rule_description = data.rule_description;
updatePayload.updatedById = currentUser.id;
await alerts.update(updatePayload, {transaction});
if (data.target_asset !== undefined) {
await alerts.setTarget_asset(
data.target_asset,
{ transaction }
);
}
if (data.target_model !== undefined) {
await alerts.setTarget_model(
data.target_model,
{ transaction }
);
}
if (data.owner !== undefined) {
await alerts.setOwner(
data.owner,
{ transaction }
);
}
return alerts;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const alerts = await db.alerts.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of alerts) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of alerts) {
await record.destroy({transaction});
}
});
return alerts;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const alerts = await db.alerts.findByPk(id, options);
await alerts.update({
deletedBy: currentUser.id
}, {
transaction,
});
await alerts.destroy({
transaction
});
return alerts;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const alerts = await db.alerts.findOne({ where, transaction });
if (!alerts) {
return alerts;
}
const output = alerts.get({plain: true});
output.alert_events_alert = await alerts.getAlert_events_alert({
transaction
});
output.target_asset = await alerts.getTarget_asset({
transaction
});
output.target_model = await alerts.getTarget_model({
transaction
});
output.owner = await alerts.getOwner({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page || 0;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.assets,
as: 'target_asset',
where: filter.target_asset ? {
[Op.or]: [
{ id: { [Op.in]: filter.target_asset.split('|').map(term => Utils.uuid(term)) } },
{
symbol: {
[Op.or]: filter.target_asset.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.models,
as: 'target_model',
where: filter.target_model ? {
[Op.or]: [
{ id: { [Op.in]: filter.target_model.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.target_model.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.users,
as: 'owner',
where: filter.owner ? {
[Op.or]: [
{ id: { [Op.in]: filter.owner.split('|').map(term => Utils.uuid(term)) } },
{
firstName: {
[Op.or]: filter.owner.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'alerts',
'name',
filter.name,
),
};
}
if (filter.rule_description) {
where = {
...where,
[Op.and]: Utils.ilike(
'alerts',
'rule_description',
filter.rule_description,
),
};
}
if (filter.threshold_valueRange) {
const [start, end] = filter.threshold_valueRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
threshold_value: {
...where.threshold_value,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
threshold_value: {
...where.threshold_value,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.alert_type) {
where = {
...where,
alert_type: filter.alert_type,
};
}
if (filter.severity) {
where = {
...where,
severity: filter.severity,
};
}
if (filter.delivery_channel) {
where = {
...where,
delivery_channel: filter.delivery_channel,
};
}
if (filter.is_enabled) {
where = {
...where,
is_enabled: filter.is_enabled,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.alerts.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'alerts',
'name',
query,
),
],
};
}
const records = await db.alerts.findAll({
attributes: ['id', 'name'],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
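The relation filters above split their value on `|` and match each term case-insensitively via `Op.iLike`. Stripped of Sequelize, the matching semantics look like this (the `matchesAnyTerm` helper is illustrative only, not part of the codebase):

```javascript
// Illustrative stand-in for the Op.or / Op.iLike term expansion in the
// include clauses: any '|'-separated term matching as a substring passes.
function matchesAnyTerm(value, filterString) {
  return filterString
    .split("|")
    .some((term) => value.toLowerCase().includes(term.toLowerCase()));
}

console.log(matchesAnyTerm("Price threshold breach", "breach|drawdown")); // true
console.log(matchesAnyTerm("Volatility spike", "breach|drawdown"));       // false
```

Each term in the real query also gets a chance to match as a UUID via `Utils.uuid(term)`, so callers can filter by either id or display name.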

const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Api_keysDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const api_keys = await db.api_keys.create(
{
id: data.id || undefined,
name: data.name || null,
scope: data.scope || null,
is_active: data.is_active || false,
expires_at: data.expires_at || null,
key_fingerprint: data.key_fingerprint || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await api_keys.setOwner(data.owner || null, { transaction });
return api_keys;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const api_keysData = data.map((item, index) => ({
id: item.id || undefined,
name: item.name || null,
scope: item.scope || null,
is_active: item.is_active || false,
expires_at: item.expires_at || null,
key_fingerprint: item.key_fingerprint || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const api_keys = await db.api_keys.bulkCreate(api_keysData, { transaction });
// For each item created, replace relation files
return api_keys;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const api_keys = await db.api_keys.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
if (data.scope !== undefined) updatePayload.scope = data.scope;
if (data.is_active !== undefined) updatePayload.is_active = data.is_active;
if (data.expires_at !== undefined) updatePayload.expires_at = data.expires_at;
if (data.key_fingerprint !== undefined) updatePayload.key_fingerprint = data.key_fingerprint;
updatePayload.updatedById = currentUser.id;
await api_keys.update(updatePayload, {transaction});
if (data.owner !== undefined) {
await api_keys.setOwner(
data.owner,
{ transaction }
);
}
return api_keys;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const api_keys = await db.api_keys.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of api_keys) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of api_keys) {
await record.destroy({transaction});
}
});
return api_keys;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const api_keys = await db.api_keys.findByPk(id, options);
await api_keys.update({
deletedBy: currentUser.id
}, {
transaction,
});
await api_keys.destroy({
transaction
});
return api_keys;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const api_keys = await db.api_keys.findOne({ where, transaction });
if (!api_keys) {
return api_keys;
}
const output = api_keys.get({plain: true});
output.owner = await api_keys.getOwner({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.users,
as: 'owner',
where: filter.owner ? {
[Op.or]: [
{ id: { [Op.in]: filter.owner.split('|').map(term => Utils.uuid(term)) } },
{
firstName: {
[Op.or]: filter.owner.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'api_keys',
'name',
filter.name,
),
};
}
if (filter.key_fingerprint) {
where = {
...where,
[Op.and]: Utils.ilike(
'api_keys',
'key_fingerprint',
filter.key_fingerprint,
),
};
}
if (filter.expires_atRange) {
const [start, end] = filter.expires_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
expires_at: {
...where.expires_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
expires_at: {
...where.expires_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.scope) {
where = {
...where,
scope: filter.scope,
};
}
if (filter.is_active) {
where = {
...where,
is_active: filter.is_active,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.api_keys.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'api_keys',
'name',
query,
),
],
};
}
const records = await db.api_keys.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
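`bulkImport` above staggers `createdAt` by one second per row so that a single `bulkCreate` call preserves import order under the default `createdAt` sort. The arithmetic in isolation:

```javascript
// Stagger timestamps exactly as bulkImport does: base time + index seconds.
const base = Date.now();
const items = ['first', 'second', 'third'];

const rows = items.map((name, index) => ({
  name,
  createdAt: new Date(base + index * 1000), // 1 s apart per row
}));

// Rows sort back into import order by createdAt.
console.log(rows[2].createdAt.getTime() - rows[0].createdAt.getTime()); // 2000
```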


@ -0,0 +1,524 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class AssetsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const assets = await db.assets.create(
{
id: data.id || undefined,
symbol: data.symbol || null,
name: data.name || null,
asset_class: data.asset_class || null,
currency: data.currency || null,
exchange_venue: data.exchange_venue || null,
is_active: data.is_active || false,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await assets.setPrimary_data_source( data.primary_data_source || null, {
transaction,
});
return assets;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const assetsData = data.map((item, index) => ({
id: item.id || undefined,
symbol: item.symbol || null,
name: item.name || null,
asset_class: item.asset_class || null,
currency: item.currency || null,
exchange_venue: item.exchange_venue || null,
is_active: item.is_active || false,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const assets = await db.assets.bulkCreate(assetsData, { transaction });
// For each item created, replace relation files
return assets;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const assets = await db.assets.findByPk(id, { transaction });
const updatePayload = {};
if (data.symbol !== undefined) updatePayload.symbol = data.symbol;
if (data.name !== undefined) updatePayload.name = data.name;
if (data.asset_class !== undefined) updatePayload.asset_class = data.asset_class;
if (data.currency !== undefined) updatePayload.currency = data.currency;
if (data.exchange_venue !== undefined) updatePayload.exchange_venue = data.exchange_venue;
if (data.is_active !== undefined) updatePayload.is_active = data.is_active;
updatePayload.updatedById = currentUser.id;
await assets.update(updatePayload, {transaction});
if (data.primary_data_source !== undefined) {
await assets.setPrimary_data_source(
data.primary_data_source,
{ transaction }
);
}
return assets;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const assets = await db.assets.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of assets) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of assets) {
await record.destroy({transaction});
}
});
return assets;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const assets = await db.assets.findByPk(id, options);
await assets.update({
deletedBy: currentUser.id
}, {
transaction,
});
await assets.destroy({
transaction
});
return assets;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const assets = await db.assets.findOne({ where, transaction });
if (!assets) {
return assets;
}
const output = assets.get({plain: true});
output.time_series_asset = await assets.getTime_series_asset({
transaction
});
output.mining_companies_equity_asset = await assets.getMining_companies_equity_asset({
transaction
});
output.macro_indicators_series_asset = await assets.getMacro_indicators_series_asset({
transaction
});
output.forecasts_target_asset = await assets.getForecasts_target_asset({
transaction
});
output.scenario_shocks_asset = await assets.getScenario_shocks_asset({
transaction
});
output.alerts_target_asset = await assets.getAlerts_target_asset({
transaction
});
output.primary_data_source = await assets.getPrimary_data_source({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.data_sources,
as: 'primary_data_source',
where: filter.primary_data_source ? {
[Op.or]: [
{ id: { [Op.in]: filter.primary_data_source.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.primary_data_source.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.symbol) {
where = {
...where,
[Op.and]: Utils.ilike(
'assets',
'symbol',
filter.symbol,
),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'assets',
'name',
filter.name,
),
};
}
if (filter.currency) {
where = {
...where,
[Op.and]: Utils.ilike(
'assets',
'currency',
filter.currency,
),
};
}
if (filter.exchange_venue) {
where = {
...where,
[Op.and]: Utils.ilike(
'assets',
'exchange_venue',
filter.exchange_venue,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.asset_class) {
where = {
...where,
asset_class: filter.asset_class,
};
}
if (filter.is_active) {
where = {
...where,
is_active: filter.is_active,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.assets.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'assets',
'symbol',
query,
),
],
};
}
const records = await db.assets.findAll({
attributes: [ 'id', 'symbol' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['symbol', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.symbol,
}));
}
};
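The `include` blocks above turn a pipe-separated relation filter such as `alice|bob` into an `Op.or` of case-insensitive pattern terms. A standalone sketch of that transformation (a plain `Symbol` stands in for Sequelize's `Op.iLike` operator key):

```javascript
// A plain Symbol stands in for Sequelize's Op.iLike operator key.
const iLike = Symbol('iLike');

const filterOwner = 'alice|bob';

// Split on '|' and wrap each term in a %…% pattern, as the include filter does.
const terms = filterOwner.split('|').map((term) => ({ [iLike]: `%${term}%` }));

console.log(terms.length, terms[0][iLike]); // 2 %alice%
```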


@ -0,0 +1,553 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Audit_eventsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const audit_events = await db.audit_events.create(
{
id: data.id || undefined,
occurred_at: data.occurred_at || null,
event_type: data.event_type || null,
outcome: data.outcome || null,
resource_type: data.resource_type || null,
resource_identifier: data.resource_identifier || null,
details: data.details || null,
ip_address: data.ip_address || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await audit_events.setActor( data.actor || null, {
transaction,
});
return audit_events;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const audit_eventsData = data.map((item, index) => ({
id: item.id || undefined,
occurred_at: item.occurred_at || null,
event_type: item.event_type || null,
outcome: item.outcome || null,
resource_type: item.resource_type || null,
resource_identifier: item.resource_identifier || null,
details: item.details || null,
ip_address: item.ip_address || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const audit_events = await db.audit_events.bulkCreate(audit_eventsData, { transaction });
// For each item created, replace relation files
return audit_events;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const audit_events = await db.audit_events.findByPk(id, { transaction });
const updatePayload = {};
if (data.occurred_at !== undefined) updatePayload.occurred_at = data.occurred_at;
if (data.event_type !== undefined) updatePayload.event_type = data.event_type;
if (data.outcome !== undefined) updatePayload.outcome = data.outcome;
if (data.resource_type !== undefined) updatePayload.resource_type = data.resource_type;
if (data.resource_identifier !== undefined) updatePayload.resource_identifier = data.resource_identifier;
if (data.details !== undefined) updatePayload.details = data.details;
if (data.ip_address !== undefined) updatePayload.ip_address = data.ip_address;
updatePayload.updatedById = currentUser.id;
await audit_events.update(updatePayload, {transaction});
if (data.actor !== undefined) {
await audit_events.setActor(
data.actor,
{ transaction }
);
}
return audit_events;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const audit_events = await db.audit_events.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of audit_events) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of audit_events) {
await record.destroy({transaction});
}
});
return audit_events;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const audit_events = await db.audit_events.findByPk(id, options);
await audit_events.update({
deletedBy: currentUser.id
}, {
transaction,
});
await audit_events.destroy({
transaction
});
return audit_events;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const audit_events = await db.audit_events.findOne({ where, transaction });
if (!audit_events) {
return audit_events;
}
const output = audit_events.get({plain: true});
output.actor = await audit_events.getActor({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.users,
as: 'actor',
where: filter.actor ? {
[Op.or]: [
{ id: { [Op.in]: filter.actor.split('|').map(term => Utils.uuid(term)) } },
{
firstName: {
[Op.or]: filter.actor.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.resource_type) {
where = {
...where,
[Op.and]: Utils.ilike(
'audit_events',
'resource_type',
filter.resource_type,
),
};
}
if (filter.resource_identifier) {
where = {
...where,
[Op.and]: Utils.ilike(
'audit_events',
'resource_identifier',
filter.resource_identifier,
),
};
}
if (filter.details) {
where = {
...where,
[Op.and]: Utils.ilike(
'audit_events',
'details',
filter.details,
),
};
}
if (filter.ip_address) {
where = {
...where,
[Op.and]: Utils.ilike(
'audit_events',
'ip_address',
filter.ip_address,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
occurred_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
};
}
if (filter.occurred_atRange) {
const [start, end] = filter.occurred_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
occurred_at: {
...where.occurred_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
occurred_at: {
...where.occurred_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.event_type) {
where = {
...where,
event_type: filter.event_type,
};
}
if (filter.outcome) {
where = {
...where,
outcome: filter.outcome,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.audit_events.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'audit_events',
'event_type',
query,
),
],
};
}
const records = await db.audit_events.findAll({
attributes: [ 'id', 'event_type' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['event_type', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.event_type,
}));
}
};
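`findAll` above builds range conditions by spreading the existing key, so both bounds of `createdAtRange` (or `occurred_atRange`) accumulate on a single object. The merge in isolation, with plain Symbols standing in for `Op.gte`/`Op.lte`:

```javascript
// Plain Symbols stand in for Sequelize's Op.gte / Op.lte operator keys.
const Op = { gte: Symbol('gte'), lte: Symbol('lte') };

let where = {};
const [start, end] = ['2024-01-01', '2024-12-31'];

// Each bound spreads the existing condition so both land on one object.
if (start) where = { ...where, createdAt: { ...where.createdAt, [Op.gte]: start } };
if (end) where = { ...where, createdAt: { ...where.createdAt, [Op.lte]: end } };

console.log(where.createdAt[Op.gte], where.createdAt[Op.lte]); // 2024-01-01 2024-12-31
```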


@ -0,0 +1,593 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Data_sourcesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const data_sources = await db.data_sources.create(
{
id: data.id || undefined,
name: data.name || null,
category: data.category || null,
ingestion_mode: data.ingestion_mode || null,
connection_type: data.connection_type || null,
coverage_description: data.coverage_description || null,
refresh_rate: data.refresh_rate || null,
is_enabled: data.is_enabled || false,
last_success_at: data.last_success_at || null,
last_failure_at: data.last_failure_at || null,
last_failure_reason: data.last_failure_reason || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
return data_sources;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const data_sourcesData = data.map((item, index) => ({
id: item.id || undefined,
name: item.name || null,
category: item.category || null,
ingestion_mode: item.ingestion_mode || null,
connection_type: item.connection_type || null,
coverage_description: item.coverage_description || null,
refresh_rate: item.refresh_rate || null,
is_enabled: item.is_enabled || false,
last_success_at: item.last_success_at || null,
last_failure_at: item.last_failure_at || null,
last_failure_reason: item.last_failure_reason || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const data_sources = await db.data_sources.bulkCreate(data_sourcesData, { transaction });
// For each item created, replace relation files
return data_sources;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const data_sources = await db.data_sources.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
if (data.category !== undefined) updatePayload.category = data.category;
if (data.ingestion_mode !== undefined) updatePayload.ingestion_mode = data.ingestion_mode;
if (data.connection_type !== undefined) updatePayload.connection_type = data.connection_type;
if (data.coverage_description !== undefined) updatePayload.coverage_description = data.coverage_description;
if (data.refresh_rate !== undefined) updatePayload.refresh_rate = data.refresh_rate;
if (data.is_enabled !== undefined) updatePayload.is_enabled = data.is_enabled;
if (data.last_success_at !== undefined) updatePayload.last_success_at = data.last_success_at;
if (data.last_failure_at !== undefined) updatePayload.last_failure_at = data.last_failure_at;
if (data.last_failure_reason !== undefined) updatePayload.last_failure_reason = data.last_failure_reason;
updatePayload.updatedById = currentUser.id;
await data_sources.update(updatePayload, {transaction});
return data_sources;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const data_sources = await db.data_sources.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of data_sources) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of data_sources) {
await record.destroy({transaction});
}
});
return data_sources;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const data_sources = await db.data_sources.findByPk(id, options);
await data_sources.update({
deletedBy: currentUser.id
}, {
transaction,
});
await data_sources.destroy({
transaction
});
return data_sources;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const data_sources = await db.data_sources.findOne({ where, transaction });
if (!data_sources) {
return data_sources;
}
const output = data_sources.get({plain: true});
output.assets_primary_data_source = await data_sources.getAssets_primary_data_source({
transaction
});
output.macro_indicators_data_source = await data_sources.getMacro_indicators_data_source({
transaction
});
output.geopolitical_events_data_source = await data_sources.getGeopolitical_events_data_source({
transaction
});
output.geopolitical_scores_data_source = await data_sources.getGeopolitical_scores_data_source({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'data_sources',
'name',
filter.name,
),
};
}
if (filter.connection_type) {
where = {
...where,
[Op.and]: Utils.ilike(
'data_sources',
'connection_type',
filter.connection_type,
),
};
}
if (filter.coverage_description) {
where = {
...where,
[Op.and]: Utils.ilike(
'data_sources',
'coverage_description',
filter.coverage_description,
),
};
}
if (filter.last_failure_reason) {
where = {
...where,
[Op.and]: Utils.ilike(
'data_sources',
'last_failure_reason',
filter.last_failure_reason,
),
};
}
if (filter.last_success_atRange) {
const [start, end] = filter.last_success_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
last_success_at: {
...where.last_success_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
last_success_at: {
...where.last_success_at,
[Op.lte]: end,
},
};
}
}
if (filter.last_failure_atRange) {
const [start, end] = filter.last_failure_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
last_failure_at: {
...where.last_failure_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
last_failure_at: {
...where.last_failure_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.category) {
where = {
...where,
category: filter.category,
};
}
if (filter.ingestion_mode) {
where = {
...where,
ingestion_mode: filter.ingestion_mode,
};
}
if (filter.refresh_rate) {
where = {
...where,
refresh_rate: filter.refresh_rate,
};
}
if (filter.is_enabled) {
where = {
...where,
is_enabled: filter.is_enabled,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.data_sources.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'data_sources',
'name',
query,
),
],
};
}
const records = await db.data_sources.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
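`deleteByIds` above deletes in two passes: first stamp `deletedBy` on every record, then `destroy` each one (on paranoid models this sets `deletedAt` rather than issuing a hard DELETE). A sketch of the two phases with plain objects standing in for rows:

```javascript
// Plain objects stand in for Sequelize rows; the two-pass shape is the point.
const currentUser = { id: 'u1' };
const rows = [{ id: 'r1' }, { id: 'r2' }];

// Phase 1: stamp the audit field on every record first.
for (const record of rows) record.deletedBy = currentUser.id;

// Phase 2: soft-delete; paranoid models record deletedAt instead of removing.
for (const record of rows) record.deletedAt = new Date();

console.log(rows.every((r) => r.deletedBy === 'u1' && r.deletedAt instanceof Date)); // true
```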


@ -0,0 +1,504 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Factor_attributionsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const factor_attributions = await db.factor_attributions.create(
{
id: data.id || undefined,
factor_name: data.factor_name || null,
factor_category: data.factor_category || null,
contribution: data.contribution || null,
importance: data.importance || null,
notes: data.notes || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await factor_attributions.setForecast( data.forecast || null, {
transaction,
});
return factor_attributions;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const factor_attributionsData = data.map((item, index) => ({
id: item.id || undefined,
factor_name: item.factor_name || null,
factor_category: item.factor_category || null,
contribution: item.contribution || null,
importance: item.importance || null,
notes: item.notes || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const factor_attributions = await db.factor_attributions.bulkCreate(factor_attributionsData, { transaction });
// For each item created, replace relation files
return factor_attributions;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const factor_attributions = await db.factor_attributions.findByPk(id, { transaction });
const updatePayload = {};
if (data.factor_name !== undefined) updatePayload.factor_name = data.factor_name;
if (data.factor_category !== undefined) updatePayload.factor_category = data.factor_category;
if (data.contribution !== undefined) updatePayload.contribution = data.contribution;
if (data.importance !== undefined) updatePayload.importance = data.importance;
if (data.notes !== undefined) updatePayload.notes = data.notes;
updatePayload.updatedById = currentUser.id;
await factor_attributions.update(updatePayload, {transaction});
if (data.forecast !== undefined) {
await factor_attributions.setForecast(
data.forecast,
{ transaction }
);
}
return factor_attributions;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const factor_attributions = await db.factor_attributions.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of factor_attributions) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of factor_attributions) {
await record.destroy({transaction});
}
});
return factor_attributions;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const factor_attributions = await db.factor_attributions.findByPk(id, options);
await factor_attributions.update({
deletedBy: currentUser.id
}, {
transaction,
});
await factor_attributions.destroy({
transaction
});
return factor_attributions;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const factor_attributions = await db.factor_attributions.findOne({
      where,
      transaction,
    });
if (!factor_attributions) {
return factor_attributions;
}
const output = factor_attributions.get({plain: true});
output.forecast = await factor_attributions.getForecast({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
    const limit = filter.limit || 0;
    const currentPage = +filter.page || 0;
    const offset = currentPage * limit;
    let where = {};
let include = [
{
model: db.forecasts,
as: 'forecast',
where: filter.forecast ? {
[Op.or]: [
{ id: { [Op.in]: filter.forecast.split('|').map(term => Utils.uuid(term)) } },
{
horizon: {
[Op.or]: filter.forecast.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.factor_name) {
where = {
...where,
[Op.and]: Utils.ilike(
'factor_attributions',
'factor_name',
filter.factor_name,
),
};
}
if (filter.notes) {
where = {
...where,
[Op.and]: Utils.ilike(
'factor_attributions',
'notes',
filter.notes,
),
};
}
if (filter.contributionRange) {
const [start, end] = filter.contributionRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
contribution: {
...where.contribution,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
contribution: {
...where.contribution,
[Op.lte]: end,
},
};
}
}
if (filter.importanceRange) {
const [start, end] = filter.importanceRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
importance: {
...where.importance,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
importance: {
...where.importance,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.factor_category) {
where = {
...where,
factor_category: filter.factor_category,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.factor_attributions.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'factor_attributions',
'factor_name',
query,
),
],
};
}
const records = await db.factor_attributions.findAll({
attributes: [ 'id', 'factor_name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['factor_name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.factor_name,
}));
}
};

@@ -0,0 +1,555 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Feature_setsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const feature_sets = await db.feature_sets.create(
{
id: data.id || undefined,
        name: data.name || null,
        description: data.description || null,
        version_stage: data.version_stage || null,
        effective_from_at: data.effective_from_at || null,
        effective_to_at: data.effective_to_at || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await feature_sets.setInput_assets(data.input_assets || [], {
transaction,
});
await feature_sets.setMacro_indicators(data.macro_indicators || [], {
transaction,
});
return feature_sets;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const feature_setsData = data.map((item, index) => ({
id: item.id || undefined,
      name: item.name || null,
      description: item.description || null,
      version_stage: item.version_stage || null,
      effective_from_at: item.effective_from_at || null,
      effective_to_at: item.effective_to_at || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const feature_sets = await db.feature_sets.bulkCreate(feature_setsData, { transaction });
return feature_sets;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const feature_sets = await db.feature_sets.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
if (data.description !== undefined) updatePayload.description = data.description;
if (data.version_stage !== undefined) updatePayload.version_stage = data.version_stage;
if (data.effective_from_at !== undefined) updatePayload.effective_from_at = data.effective_from_at;
if (data.effective_to_at !== undefined) updatePayload.effective_to_at = data.effective_to_at;
updatePayload.updatedById = currentUser.id;
await feature_sets.update(updatePayload, {transaction});
if (data.input_assets !== undefined) {
await feature_sets.setInput_assets(data.input_assets, { transaction });
}
if (data.macro_indicators !== undefined) {
await feature_sets.setMacro_indicators(data.macro_indicators, { transaction });
}
return feature_sets;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const feature_sets = await db.feature_sets.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of feature_sets) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of feature_sets) {
await record.destroy({transaction});
}
});
return feature_sets;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const feature_sets = await db.feature_sets.findByPk(id, options);
await feature_sets.update({
deletedBy: currentUser.id
}, {
transaction,
});
await feature_sets.destroy({
transaction
});
return feature_sets;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const feature_sets = await db.feature_sets.findOne({
      where,
      transaction,
    });
if (!feature_sets) {
return feature_sets;
}
const output = feature_sets.get({plain: true});
output.models_feature_set = await feature_sets.getModels_feature_set({
transaction
});
output.input_assets = await feature_sets.getInput_assets({
transaction
});
output.macro_indicators = await feature_sets.getMacro_indicators({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
    const limit = filter.limit || 0;
    const currentPage = +filter.page || 0;
    const offset = currentPage * limit;
    let where = {};
let include = [
{
model: db.assets,
as: 'input_assets',
required: false,
},
{
model: db.macro_indicators,
as: 'macro_indicators',
required: false,
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'feature_sets',
'name',
filter.name,
),
};
}
if (filter.description) {
where = {
...where,
[Op.and]: Utils.ilike(
'feature_sets',
'description',
filter.description,
),
};
}
if (filter.effective_from_atRange) {
const [start, end] = filter.effective_from_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
effective_from_at: {
...where.effective_from_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
effective_from_at: {
...where.effective_from_at,
[Op.lte]: end,
},
};
}
}
if (filter.effective_to_atRange) {
const [start, end] = filter.effective_to_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
effective_to_at: {
...where.effective_to_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
effective_to_at: {
...where.effective_to_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.version_stage) {
where = {
...where,
version_stage: filter.version_stage,
};
}
if (filter.input_assets) {
const searchTerms = filter.input_assets.split('|');
include = [
{
model: db.assets,
as: 'input_assets_filter',
required: searchTerms.length > 0,
where: searchTerms.length > 0 ? {
[Op.or]: [
{ id: { [Op.in]: searchTerms.map(term => Utils.uuid(term)) } },
{
symbol: {
[Op.or]: searchTerms.map(term => ({ [Op.iLike]: `%${term}%` }))
}
}
]
} : undefined
},
...include,
]
}
if (filter.macro_indicators) {
const searchTerms = filter.macro_indicators.split('|');
include = [
{
model: db.macro_indicators,
as: 'macro_indicators_filter',
required: searchTerms.length > 0,
where: searchTerms.length > 0 ? {
[Op.or]: [
{ id: { [Op.in]: searchTerms.map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: searchTerms.map(term => ({ [Op.iLike]: `%${term}%` }))
}
}
]
} : undefined
},
...include,
]
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.feature_sets.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'feature_sets',
'name',
query,
),
],
};
}
const records = await db.feature_sets.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};

@@ -0,0 +1,87 @@
const db = require('../models');
const assert = require('assert');
const services = require('../../services/file');
module.exports = class FileDBApi {
static async replaceRelationFiles(
relation,
rawFiles,
options,
) {
assert(relation.belongsTo, 'belongsTo is required');
assert(
relation.belongsToColumn,
'belongsToColumn is required',
);
assert(relation.belongsToId, 'belongsToId is required');
let files = [];
if (Array.isArray(rawFiles)) {
files = rawFiles;
} else {
files = rawFiles ? [rawFiles] : [];
}
await this._removeLegacyFiles(relation, files, options);
await this._addFiles(relation, files, options);
}
static async _addFiles(relation, files, options) {
const transaction = (options && options.transaction) || undefined;
const currentUser = (options && options.currentUser) || {id: null};
const inexistentFiles = files.filter(
(file) => !!file.new,
);
for (const file of inexistentFiles) {
await db.file.create(
{
belongsTo: relation.belongsTo,
belongsToColumn: relation.belongsToColumn,
belongsToId: relation.belongsToId,
name: file.name,
sizeInBytes: file.sizeInBytes,
privateUrl: file.privateUrl,
publicUrl: file.publicUrl,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{
transaction,
},
);
}
}
static async _removeLegacyFiles(
relation,
files,
options,
) {
const transaction = (options && options.transaction) || undefined;
const filesToDelete = await db.file.findAll({
where: {
belongsTo: relation.belongsTo,
belongsToId: relation.belongsToId,
belongsToColumn: relation.belongsToColumn,
        id: {
          [db.Sequelize.Op.notIn]: files
            .filter((file) => !file.new)
            .map((file) => file.id),
        },
},
transaction,
});
for (let file of filesToDelete) {
await services.deleteGCloud(file.privateUrl);
await file.destroy({
transaction,
});
}
}
};

@@ -0,0 +1,785 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class ForecastsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const forecasts = await db.forecasts.create(
{
id: data.id || undefined,
        as_of_at: data.as_of_at || null,
        horizon: data.horizon || null,
        target_time_at: data.target_time_at || null,
        point_estimate: data.point_estimate || null,
        p10: data.p10 || null,
        p50: data.p50 || null,
        p90: data.p90 || null,
        volatility_forecast: data.volatility_forecast || null,
        signal_direction: data.signal_direction || null,
        signal_confidence: data.signal_confidence || null,
        explainability_summary: data.explainability_summary || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
    await forecasts.setModel(data.model || null, { transaction });
    await forecasts.setTarget_asset(data.target_asset || null, { transaction });
return forecasts;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const forecastsData = data.map((item, index) => ({
id: item.id || undefined,
      as_of_at: item.as_of_at || null,
      horizon: item.horizon || null,
      target_time_at: item.target_time_at || null,
      point_estimate: item.point_estimate || null,
      p10: item.p10 || null,
      p50: item.p50 || null,
      p90: item.p90 || null,
      volatility_forecast: item.volatility_forecast || null,
      signal_direction: item.signal_direction || null,
      signal_confidence: item.signal_confidence || null,
      explainability_summary: item.explainability_summary || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const forecasts = await db.forecasts.bulkCreate(forecastsData, { transaction });
return forecasts;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const forecasts = await db.forecasts.findByPk(id, { transaction });
const updatePayload = {};
if (data.as_of_at !== undefined) updatePayload.as_of_at = data.as_of_at;
if (data.horizon !== undefined) updatePayload.horizon = data.horizon;
if (data.target_time_at !== undefined) updatePayload.target_time_at = data.target_time_at;
if (data.point_estimate !== undefined) updatePayload.point_estimate = data.point_estimate;
if (data.p10 !== undefined) updatePayload.p10 = data.p10;
if (data.p50 !== undefined) updatePayload.p50 = data.p50;
if (data.p90 !== undefined) updatePayload.p90 = data.p90;
if (data.volatility_forecast !== undefined) updatePayload.volatility_forecast = data.volatility_forecast;
if (data.signal_direction !== undefined) updatePayload.signal_direction = data.signal_direction;
if (data.signal_confidence !== undefined) updatePayload.signal_confidence = data.signal_confidence;
if (data.explainability_summary !== undefined) updatePayload.explainability_summary = data.explainability_summary;
updatePayload.updatedById = currentUser.id;
await forecasts.update(updatePayload, {transaction});
if (data.model !== undefined) {
await forecasts.setModel(
data.model,
{ transaction }
);
}
if (data.target_asset !== undefined) {
await forecasts.setTarget_asset(
data.target_asset,
{ transaction }
);
}
return forecasts;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const forecasts = await db.forecasts.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of forecasts) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of forecasts) {
await record.destroy({transaction});
}
});
return forecasts;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const forecasts = await db.forecasts.findByPk(id, options);
await forecasts.update({
deletedBy: currentUser.id
}, {
transaction,
});
await forecasts.destroy({
transaction
});
return forecasts;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const forecasts = await db.forecasts.findOne({
      where,
      transaction,
    });
if (!forecasts) {
return forecasts;
}
const output = forecasts.get({plain: true});
output.factor_attributions_forecast = await forecasts.getFactor_attributions_forecast({
transaction
});
output.scenario_results_forecast = await forecasts.getScenario_results_forecast({
transaction
});
output.model = await forecasts.getModel({
transaction
});
output.target_asset = await forecasts.getTarget_asset({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
    const limit = filter.limit || 0;
    const currentPage = +filter.page || 0;
    const offset = currentPage * limit;
    let where = {};
let include = [
{
model: db.models,
as: 'model',
where: filter.model ? {
[Op.or]: [
{ id: { [Op.in]: filter.model.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.model.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.assets,
as: 'target_asset',
where: filter.target_asset ? {
[Op.or]: [
{ id: { [Op.in]: filter.target_asset.split('|').map(term => Utils.uuid(term)) } },
{
symbol: {
[Op.or]: filter.target_asset.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.explainability_summary) {
where = {
...where,
[Op.and]: Utils.ilike(
'forecasts',
'explainability_summary',
filter.explainability_summary,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
[Op.or]: [
{
as_of_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
{
target_time_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
],
};
}
if (filter.as_of_atRange) {
const [start, end] = filter.as_of_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
as_of_at: {
...where.as_of_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
as_of_at: {
...where.as_of_at,
[Op.lte]: end,
},
};
}
}
if (filter.target_time_atRange) {
const [start, end] = filter.target_time_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
target_time_at: {
...where.target_time_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
target_time_at: {
...where.target_time_at,
[Op.lte]: end,
},
};
}
}
if (filter.point_estimateRange) {
const [start, end] = filter.point_estimateRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
point_estimate: {
...where.point_estimate,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
point_estimate: {
...where.point_estimate,
[Op.lte]: end,
},
};
}
}
if (filter.p10Range) {
const [start, end] = filter.p10Range;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
p10: {
...where.p10,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
p10: {
...where.p10,
[Op.lte]: end,
},
};
}
}
if (filter.p50Range) {
const [start, end] = filter.p50Range;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
p50: {
...where.p50,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
p50: {
...where.p50,
[Op.lte]: end,
},
};
}
}
if (filter.p90Range) {
const [start, end] = filter.p90Range;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
p90: {
...where.p90,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
p90: {
...where.p90,
[Op.lte]: end,
},
};
}
}
if (filter.volatility_forecastRange) {
const [start, end] = filter.volatility_forecastRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
volatility_forecast: {
...where.volatility_forecast,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
volatility_forecast: {
...where.volatility_forecast,
[Op.lte]: end,
},
};
}
}
if (filter.signal_confidenceRange) {
const [start, end] = filter.signal_confidenceRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
signal_confidence: {
...where.signal_confidence,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
signal_confidence: {
...where.signal_confidence,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.horizon) {
where = {
...where,
horizon: filter.horizon,
};
}
if (filter.signal_direction) {
where = {
...where,
signal_direction: filter.signal_direction,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.forecasts.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'forecasts',
'horizon',
query,
),
],
};
}
const records = await db.forecasts.findAll({
attributes: [ 'id', 'horizon' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['horizon', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.horizon,
}));
}
};

@@ -0,0 +1,627 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Geopolitical_eventsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const geopolitical_events = await db.geopolitical_events.create(
{
id: data.id || undefined,
        title: data.title || null,
        focus_area: data.focus_area || null,
        event_type: data.event_type || null,
        severity: data.severity || null,
        event_start_at: data.event_start_at || null,
        event_end_at: data.event_end_at || null,
        summary: data.summary || null,
        source_summary: data.source_summary || null,
        confidence_score: data.confidence_score || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
    await geopolitical_events.setData_source(data.data_source || null, { transaction });
return geopolitical_events;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const geopolitical_eventsData = data.map((item, index) => ({
id: item.id || undefined,
      title: item.title || null,
      focus_area: item.focus_area || null,
      event_type: item.event_type || null,
      severity: item.severity || null,
      event_start_at: item.event_start_at || null,
      event_end_at: item.event_end_at || null,
      summary: item.summary || null,
      source_summary: item.source_summary || null,
      confidence_score: item.confidence_score || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const geopolitical_events = await db.geopolitical_events.bulkCreate(geopolitical_eventsData, { transaction });
return geopolitical_events;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const geopolitical_events = await db.geopolitical_events.findByPk(id, { transaction });
const updatePayload = {};
if (data.title !== undefined) updatePayload.title = data.title;
if (data.focus_area !== undefined) updatePayload.focus_area = data.focus_area;
if (data.event_type !== undefined) updatePayload.event_type = data.event_type;
if (data.severity !== undefined) updatePayload.severity = data.severity;
if (data.event_start_at !== undefined) updatePayload.event_start_at = data.event_start_at;
if (data.event_end_at !== undefined) updatePayload.event_end_at = data.event_end_at;
if (data.summary !== undefined) updatePayload.summary = data.summary;
if (data.source_summary !== undefined) updatePayload.source_summary = data.source_summary;
if (data.confidence_score !== undefined) updatePayload.confidence_score = data.confidence_score;
updatePayload.updatedById = currentUser.id;
await geopolitical_events.update(updatePayload, {transaction});
if (data.data_source !== undefined) {
await geopolitical_events.setData_source(
data.data_source,
{ transaction }
);
}
return geopolitical_events;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const geopolitical_events = await db.geopolitical_events.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of geopolitical_events) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of geopolitical_events) {
await record.destroy({transaction});
}
});
return geopolitical_events;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const geopolitical_events = await db.geopolitical_events.findByPk(id, options);
await geopolitical_events.update({
deletedBy: currentUser.id
}, {
transaction,
});
await geopolitical_events.destroy({
transaction
});
return geopolitical_events;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const geopolitical_events = await db.geopolitical_events.findOne({
      where,
      transaction,
    });
if (!geopolitical_events) {
return geopolitical_events;
}
const output = geopolitical_events.get({plain: true});
output.geopolitical_scores_related_event = await geopolitical_events.getGeopolitical_scores_related_event({
transaction
});
output.data_source = await geopolitical_events.getData_source({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
    const limit = filter.limit || 0;
    const currentPage = +filter.page || 0;
    const offset = currentPage * limit;
    let where = {};
let include = [
{
model: db.data_sources,
as: 'data_source',
where: filter.data_source ? {
[Op.or]: [
{ id: { [Op.in]: filter.data_source.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.data_source.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.title) {
where = {
...where,
[Op.and]: Utils.ilike(
'geopolitical_events',
'title',
filter.title,
),
};
}
if (filter.summary) {
where = {
...where,
[Op.and]: Utils.ilike(
'geopolitical_events',
'summary',
filter.summary,
),
};
}
if (filter.source_summary) {
where = {
...where,
[Op.and]: Utils.ilike(
'geopolitical_events',
'source_summary',
filter.source_summary,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
[Op.or]: [
{
event_start_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
{
event_end_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
],
};
}
if (filter.event_start_atRange) {
const [start, end] = filter.event_start_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
event_start_at: {
...where.event_start_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
event_start_at: {
...where.event_start_at,
[Op.lte]: end,
},
};
}
}
if (filter.event_end_atRange) {
const [start, end] = filter.event_end_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
event_end_at: {
...where.event_end_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
event_end_at: {
...where.event_end_at,
[Op.lte]: end,
},
};
}
}
if (filter.confidence_scoreRange) {
const [start, end] = filter.confidence_scoreRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
confidence_score: {
...where.confidence_score,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
confidence_score: {
...where.confidence_score,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.focus_area) {
where = {
...where,
focus_area: filter.focus_area,
};
}
if (filter.event_type) {
where = {
...where,
event_type: filter.event_type,
};
}
if (filter.severity) {
where = {
...where,
severity: filter.severity,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.geopolitical_events.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'geopolitical_events',
'title',
query,
),
],
};
}
const records = await db.geopolitical_events.findAll({
attributes: [ 'id', 'title' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['title', 'ASC']], // Sequelize option is `order`; `orderBy` is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.title,
}));
}
};
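The `findAll` methods above repeat the same numeric/date range pattern for each `*Range` filter. A minimal standalone sketch of that step (hypothetical helper name `applyRange`; plain `gte`/`lte` string keys stand in for the Sequelize `Op.gte`/`Op.lte` symbols purely for illustration):

```javascript
// Hypothetical helper illustrating the range-filter pattern used in findAll:
// each bound is applied only when it is a non-empty, non-null value.
function applyRange(where, field, range) {
  const [start, end] = range || [];
  const clause = { ...where[field] };
  if (start !== undefined && start !== null && start !== '') clause.gte = start;
  if (end !== undefined && end !== null && end !== '') clause.lte = end;
  // Leave `where` untouched when both bounds are absent.
  return Object.keys(clause).length ? { ...where, [field]: clause } : where;
}

// Example: a confidence_score range of [0.5, 0.9]
const where = applyRange({}, 'confidence_score', [0.5, 0.9]);
// where → { confidence_score: { gte: 0.5, lte: 0.9 } }
```

In the real service the same shape is produced inline for every range filter; extracting it would remove most of the duplication in `findAll`.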


@ -0,0 +1,574 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Geopolitical_scoresDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const geopolitical_scores = await db.geopolitical_scores.create(
{
id: data.id || undefined,
        as_of_at: data.as_of_at || null,
        score_type: data.score_type || null,
        horizon: data.horizon || null,
        score_value: data.score_value || null,
        iran_conflict_weight: data.iran_conflict_weight || null,
        methodology_note: data.methodology_note || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await geopolitical_scores.setRelated_event( data.related_event || null, {
transaction,
});
await geopolitical_scores.setData_source( data.data_source || null, {
transaction,
});
return geopolitical_scores;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const geopolitical_scoresData = data.map((item, index) => ({
id: item.id || undefined,
      as_of_at: item.as_of_at || null,
      score_type: item.score_type || null,
      horizon: item.horizon || null,
      score_value: item.score_value || null,
      iran_conflict_weight: item.iran_conflict_weight || null,
      methodology_note: item.methodology_note || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const geopolitical_scores = await db.geopolitical_scores.bulkCreate(geopolitical_scoresData, { transaction });
// For each item created, replace relation files
return geopolitical_scores;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
// findByPk takes (id, options); a third argument is ignored, so the transaction goes in options.
const geopolitical_scores = await db.geopolitical_scores.findByPk(id, { transaction });
const updatePayload = {};
if (data.as_of_at !== undefined) updatePayload.as_of_at = data.as_of_at;
if (data.score_type !== undefined) updatePayload.score_type = data.score_type;
if (data.horizon !== undefined) updatePayload.horizon = data.horizon;
if (data.score_value !== undefined) updatePayload.score_value = data.score_value;
if (data.iran_conflict_weight !== undefined) updatePayload.iran_conflict_weight = data.iran_conflict_weight;
if (data.methodology_note !== undefined) updatePayload.methodology_note = data.methodology_note;
updatePayload.updatedById = currentUser.id;
await geopolitical_scores.update(updatePayload, {transaction});
if (data.related_event !== undefined) {
await geopolitical_scores.setRelated_event(
data.related_event,
{ transaction }
);
}
if (data.data_source !== undefined) {
await geopolitical_scores.setData_source(
data.data_source,
{ transaction }
);
}
return geopolitical_scores;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const geopolitical_scores = await db.geopolitical_scores.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of geopolitical_scores) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of geopolitical_scores) {
await record.destroy({transaction});
}
});
return geopolitical_scores;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const geopolitical_scores = await db.geopolitical_scores.findByPk(id, options);
await geopolitical_scores.update({
deletedBy: currentUser.id
}, {
transaction,
});
await geopolitical_scores.destroy({
transaction
});
return geopolitical_scores;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const geopolitical_scores = await db.geopolitical_scores.findOne(
// findOne takes a single options object; merge the transaction in with `where`.
{ where, transaction },
);
if (!geopolitical_scores) {
return geopolitical_scores;
}
const output = geopolitical_scores.get({plain: true});
output.related_event = await geopolitical_scores.getRelated_event({
transaction
});
output.data_source = await geopolitical_scores.getData_source({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.geopolitical_events,
as: 'related_event',
where: filter.related_event ? {
[Op.or]: [
{ id: { [Op.in]: filter.related_event.split('|').map(term => Utils.uuid(term)) } },
{
title: {
[Op.or]: filter.related_event.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.data_sources,
as: 'data_source',
where: filter.data_source ? {
[Op.or]: [
{ id: { [Op.in]: filter.data_source.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.data_source.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.methodology_note) {
where = {
...where,
[Op.and]: Utils.ilike(
'geopolitical_scores',
'methodology_note',
filter.methodology_note,
),
};
}
if (filter.as_of_atRange) {
const [start, end] = filter.as_of_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
as_of_at: {
...where.as_of_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
as_of_at: {
...where.as_of_at,
[Op.lte]: end,
},
};
}
}
if (filter.score_valueRange) {
const [start, end] = filter.score_valueRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
score_value: {
...where.score_value,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
score_value: {
...where.score_value,
[Op.lte]: end,
},
};
}
}
if (filter.iran_conflict_weightRange) {
const [start, end] = filter.iran_conflict_weightRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
iran_conflict_weight: {
...where.iran_conflict_weight,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
iran_conflict_weight: {
...where.iran_conflict_weight,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.score_type) {
where = {
...where,
score_type: filter.score_type,
};
}
if (filter.horizon) {
where = {
...where,
horizon: filter.horizon,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.geopolitical_scores.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'geopolitical_scores',
'score_type',
query,
),
],
};
}
const records = await db.geopolitical_scores.findAll({
attributes: [ 'id', 'score_type' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['score_type', 'ASC']], // Sequelize option is `order`; `orderBy` is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.score_type,
}));
}
};
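The association `include` blocks above all parse the same pipe-delimited relation filter: each `|`-separated term is matched either as an exact id or as a case-insensitive substring of the display column. The parsing step in isolation (hypothetical helper name `parseRelationFilter`):

```javascript
// Hypothetical sketch of the pipe-delimited relation filter used in the
// include blocks of findAll: 'abc|xyz' becomes exact-id candidates plus
// iLike substring patterns for the related model's display column.
function parseRelationFilter(raw) {
  const terms = raw.split('|').filter(Boolean);
  return {
    ids: terms,                           // fed to Op.in on the related id column
    patterns: terms.map((t) => `%${t}%`), // fed to Op.iLike on the display column
  };
}

const f = parseRelationFilter('abc-123|Reuters');
// f.patterns → ['%abc-123%', '%Reuters%']
```

In the real code the two arrays are wrapped in an `Op.or` so either an id hit or a name match satisfies the include's `where`.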


@ -0,0 +1,557 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Macro_indicatorsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const macro_indicators = await db.macro_indicators.create(
{
id: data.id || undefined,
        code: data.code || null,
        name: data.name || null,
        indicator_type: data.indicator_type || null,
        frequency: data.frequency || null,
        unit: data.unit || null,
        region: data.region || null,
        is_active: data.is_active || false,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await macro_indicators.setData_source( data.data_source || null, {
transaction,
});
await macro_indicators.setSeries_asset( data.series_asset || null, {
transaction,
});
return macro_indicators;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const macro_indicatorsData = data.map((item, index) => ({
id: item.id || undefined,
      code: item.code || null,
      name: item.name || null,
      indicator_type: item.indicator_type || null,
      frequency: item.frequency || null,
      unit: item.unit || null,
      region: item.region || null,
      is_active: item.is_active || false,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const macro_indicators = await db.macro_indicators.bulkCreate(macro_indicatorsData, { transaction });
// For each item created, replace relation files
return macro_indicators;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
// findByPk takes (id, options); a third argument is ignored, so the transaction goes in options.
const macro_indicators = await db.macro_indicators.findByPk(id, { transaction });
const updatePayload = {};
if (data.code !== undefined) updatePayload.code = data.code;
if (data.name !== undefined) updatePayload.name = data.name;
if (data.indicator_type !== undefined) updatePayload.indicator_type = data.indicator_type;
if (data.frequency !== undefined) updatePayload.frequency = data.frequency;
if (data.unit !== undefined) updatePayload.unit = data.unit;
if (data.region !== undefined) updatePayload.region = data.region;
if (data.is_active !== undefined) updatePayload.is_active = data.is_active;
updatePayload.updatedById = currentUser.id;
await macro_indicators.update(updatePayload, {transaction});
if (data.data_source !== undefined) {
await macro_indicators.setData_source(
data.data_source,
{ transaction }
);
}
if (data.series_asset !== undefined) {
await macro_indicators.setSeries_asset(
data.series_asset,
{ transaction }
);
}
return macro_indicators;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const macro_indicators = await db.macro_indicators.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of macro_indicators) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of macro_indicators) {
await record.destroy({transaction});
}
});
return macro_indicators;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const macro_indicators = await db.macro_indicators.findByPk(id, options);
await macro_indicators.update({
deletedBy: currentUser.id
}, {
transaction,
});
await macro_indicators.destroy({
transaction
});
return macro_indicators;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const macro_indicators = await db.macro_indicators.findOne(
// findOne takes a single options object; merge the transaction in with `where`.
{ where, transaction },
);
if (!macro_indicators) {
return macro_indicators;
}
const output = macro_indicators.get({plain: true});
output.data_source = await macro_indicators.getData_source({
transaction
});
output.series_asset = await macro_indicators.getSeries_asset({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.data_sources,
as: 'data_source',
where: filter.data_source ? {
[Op.or]: [
{ id: { [Op.in]: filter.data_source.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.data_source.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.assets,
as: 'series_asset',
where: filter.series_asset ? {
[Op.or]: [
{ id: { [Op.in]: filter.series_asset.split('|').map(term => Utils.uuid(term)) } },
{
symbol: {
[Op.or]: filter.series_asset.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.code) {
where = {
...where,
[Op.and]: Utils.ilike(
'macro_indicators',
'code',
filter.code,
),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'macro_indicators',
'name',
filter.name,
),
};
}
if (filter.unit) {
where = {
...where,
[Op.and]: Utils.ilike(
'macro_indicators',
'unit',
filter.unit,
),
};
}
if (filter.region) {
where = {
...where,
[Op.and]: Utils.ilike(
'macro_indicators',
'region',
filter.region,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.indicator_type) {
where = {
...where,
indicator_type: filter.indicator_type,
};
}
if (filter.frequency) {
where = {
...where,
frequency: filter.frequency,
};
}
if (filter.is_active) {
where = {
...where,
is_active: filter.is_active,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.macro_indicators.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'macro_indicators',
'name',
query,
),
],
};
}
const records = await db.macro_indicators.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name', 'ASC']], // Sequelize option is `order`; `orderBy` is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
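Each `findAll` above derives its offset from the page and limit, and the downstream falsy check turns a missing page (which yields `NaN`) into `undefined`. The arithmetic in isolation (hypothetical helper name `toPagination`):

```javascript
// Hypothetical sketch of the pagination math in findAll: offset = page * limit,
// mirroring `queryOptions.offset = offset ? Number(offset) : undefined` above.
function toPagination(filter) {
  const limit = filter.limit || 0;
  const offset = +filter.page * limit; // NaN when filter.page is undefined
  return {
    limit: limit ? Number(limit) : undefined,
    offset: offset ? Number(offset) : undefined, // NaN and 0 both collapse to undefined
  };
}

toPagination({ page: 2, limit: 25 }); // → { limit: 25, offset: 50 }
toPagination({});                     // → { limit: undefined, offset: undefined }
```

Note that page 0 and a missing page behave identically, since an offset of `0` is also falsy; callers that want an explicit first page get the same query either way.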


@ -0,0 +1,484 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Mining_companiesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const mining_companies = await db.mining_companies.create(
{
id: data.id || undefined,
        ticker: data.ticker || null,
        company_name: data.company_name || null,
        country: data.country || null,
        primary_mines: data.primary_mines || null,
        is_major_producer: data.is_major_producer || false,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await mining_companies.setEquity_asset( data.equity_asset || null, {
transaction,
});
return mining_companies;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const mining_companiesData = data.map((item, index) => ({
id: item.id || undefined,
      ticker: item.ticker || null,
      company_name: item.company_name || null,
      country: item.country || null,
      primary_mines: item.primary_mines || null,
      is_major_producer: item.is_major_producer || false,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const mining_companies = await db.mining_companies.bulkCreate(mining_companiesData, { transaction });
// For each item created, replace relation files
return mining_companies;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
// findByPk takes (id, options); a third argument is ignored, so the transaction goes in options.
const mining_companies = await db.mining_companies.findByPk(id, { transaction });
const updatePayload = {};
if (data.ticker !== undefined) updatePayload.ticker = data.ticker;
if (data.company_name !== undefined) updatePayload.company_name = data.company_name;
if (data.country !== undefined) updatePayload.country = data.country;
if (data.primary_mines !== undefined) updatePayload.primary_mines = data.primary_mines;
if (data.is_major_producer !== undefined) updatePayload.is_major_producer = data.is_major_producer;
updatePayload.updatedById = currentUser.id;
await mining_companies.update(updatePayload, {transaction});
if (data.equity_asset !== undefined) {
await mining_companies.setEquity_asset(
data.equity_asset,
{ transaction }
);
}
return mining_companies;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const mining_companies = await db.mining_companies.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of mining_companies) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of mining_companies) {
await record.destroy({transaction});
}
});
return mining_companies;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const mining_companies = await db.mining_companies.findByPk(id, options);
await mining_companies.update({
deletedBy: currentUser.id
}, {
transaction,
});
await mining_companies.destroy({
transaction
});
return mining_companies;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const mining_companies = await db.mining_companies.findOne(
// findOne takes a single options object; merge the transaction in with `where`.
{ where, transaction },
);
if (!mining_companies) {
return mining_companies;
}
const output = mining_companies.get({plain: true});
output.mining_fundamentals_mining_company = await mining_companies.getMining_fundamentals_mining_company({
transaction
});
output.equity_asset = await mining_companies.getEquity_asset({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.assets,
as: 'equity_asset',
where: filter.equity_asset ? {
[Op.or]: [
{ id: { [Op.in]: filter.equity_asset.split('|').map(term => Utils.uuid(term)) } },
{
symbol: {
[Op.or]: filter.equity_asset.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.ticker) {
where = {
...where,
[Op.and]: Utils.ilike(
'mining_companies',
'ticker',
filter.ticker,
),
};
}
if (filter.company_name) {
where = {
...where,
[Op.and]: Utils.ilike(
'mining_companies',
'company_name',
filter.company_name,
),
};
}
if (filter.country) {
where = {
...where,
[Op.and]: Utils.ilike(
'mining_companies',
'country',
filter.country,
),
};
}
if (filter.primary_mines) {
where = {
...where,
[Op.and]: Utils.ilike(
'mining_companies',
'primary_mines',
filter.primary_mines,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.is_major_producer) {
where = {
...where,
is_major_producer: filter.is_major_producer,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.mining_companies.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'mining_companies',
'company_name',
query,
),
],
};
}
const records = await db.mining_companies.findAll({
attributes: [ 'id', 'company_name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
orderBy: [['company_name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.company_name,
}));
}
};


@ -0,0 +1,794 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Mining_fundamentalsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const mining_fundamentals = await db.mining_fundamentals.create(
{
id: data.id || undefined,
        reporting_period: data.reporting_period || null,
        period_end_at: data.period_end_at || null,
        production_oz: data.production_oz || null,
        all_in_sustaining_cost: data.all_in_sustaining_cost || null,
        cash_cost: data.cash_cost || null,
        reserves_oz: data.reserves_oz || null,
        revenue: data.revenue || null,
        ebitda: data.ebitda || null,
        free_cash_flow: data.free_cash_flow || null,
        debt_to_equity: data.debt_to_equity || null,
        operating_margin: data.operating_margin || null,
        notes: data.notes || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await mining_fundamentals.setMining_company( data.mining_company || null, {
transaction,
});
return mining_fundamentals;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const mining_fundamentalsData = data.map((item, index) => ({
id: item.id || undefined,
      reporting_period: item.reporting_period || null,
      period_end_at: item.period_end_at || null,
      production_oz: item.production_oz || null,
      all_in_sustaining_cost: item.all_in_sustaining_cost || null,
      cash_cost: item.cash_cost || null,
      reserves_oz: item.reserves_oz || null,
      revenue: item.revenue || null,
      ebitda: item.ebitda || null,
      free_cash_flow: item.free_cash_flow || null,
      debt_to_equity: item.debt_to_equity || null,
      operating_margin: item.operating_margin || null,
      notes: item.notes || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const mining_fundamentals = await db.mining_fundamentals.bulkCreate(mining_fundamentalsData, { transaction });
// For each item created, replace relation files
return mining_fundamentals;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
// findByPk takes (id, options); a third argument is ignored, so the transaction goes in options.
const mining_fundamentals = await db.mining_fundamentals.findByPk(id, { transaction });
const updatePayload = {};
if (data.reporting_period !== undefined) updatePayload.reporting_period = data.reporting_period;
if (data.period_end_at !== undefined) updatePayload.period_end_at = data.period_end_at;
if (data.production_oz !== undefined) updatePayload.production_oz = data.production_oz;
if (data.all_in_sustaining_cost !== undefined) updatePayload.all_in_sustaining_cost = data.all_in_sustaining_cost;
if (data.cash_cost !== undefined) updatePayload.cash_cost = data.cash_cost;
if (data.reserves_oz !== undefined) updatePayload.reserves_oz = data.reserves_oz;
if (data.revenue !== undefined) updatePayload.revenue = data.revenue;
if (data.ebitda !== undefined) updatePayload.ebitda = data.ebitda;
if (data.free_cash_flow !== undefined) updatePayload.free_cash_flow = data.free_cash_flow;
if (data.debt_to_equity !== undefined) updatePayload.debt_to_equity = data.debt_to_equity;
if (data.operating_margin !== undefined) updatePayload.operating_margin = data.operating_margin;
if (data.notes !== undefined) updatePayload.notes = data.notes;
updatePayload.updatedById = currentUser.id;
await mining_fundamentals.update(updatePayload, {transaction});
if (data.mining_company !== undefined) {
await mining_fundamentals.setMining_company(
data.mining_company,
{ transaction }
);
}
return mining_fundamentals;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const mining_fundamentals = await db.mining_fundamentals.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of mining_fundamentals) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of mining_fundamentals) {
await record.destroy({transaction});
}
});
return mining_fundamentals;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const mining_fundamentals = await db.mining_fundamentals.findByPk(id, options);
await mining_fundamentals.update({
deletedBy: currentUser.id
}, {
transaction,
});
await mining_fundamentals.destroy({
transaction
});
return mining_fundamentals;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const mining_fundamentals = await db.mining_fundamentals.findOne(
// findOne takes a single options object; merge the transaction in with `where`.
{ where, transaction },
);
if (!mining_fundamentals) {
return mining_fundamentals;
}
const output = mining_fundamentals.get({plain: true});
output.mining_company = await mining_fundamentals.getMining_company({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
{
model: db.mining_companies,
as: 'mining_company',
where: filter.mining_company ? {
[Op.or]: [
{ id: { [Op.in]: filter.mining_company.split('|').map(term => Utils.uuid(term)) } },
{
company_name: {
[Op.or]: filter.mining_company.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.notes) {
where = {
...where,
[Op.and]: Utils.ilike(
'mining_fundamentals',
'notes',
filter.notes,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
period_end_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
};
}
if (filter.period_end_atRange) {
const [start, end] = filter.period_end_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
period_end_at: {
...where.period_end_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
period_end_at: {
...where.period_end_at,
[Op.lte]: end,
},
};
}
}
if (filter.production_ozRange) {
const [start, end] = filter.production_ozRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
production_oz: {
...where.production_oz,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
production_oz: {
...where.production_oz,
[Op.lte]: end,
},
};
}
}
if (filter.all_in_sustaining_costRange) {
const [start, end] = filter.all_in_sustaining_costRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
all_in_sustaining_cost: {
...where.all_in_sustaining_cost,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
all_in_sustaining_cost: {
...where.all_in_sustaining_cost,
[Op.lte]: end,
},
};
}
}
if (filter.cash_costRange) {
const [start, end] = filter.cash_costRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
cash_cost: {
...where.cash_cost,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
cash_cost: {
...where.cash_cost,
[Op.lte]: end,
},
};
}
}
if (filter.reserves_ozRange) {
const [start, end] = filter.reserves_ozRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
reserves_oz: {
...where.reserves_oz,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
reserves_oz: {
...where.reserves_oz,
[Op.lte]: end,
},
};
}
}
if (filter.revenueRange) {
const [start, end] = filter.revenueRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
revenue: {
...where.revenue,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
revenue: {
...where.revenue,
[Op.lte]: end,
},
};
}
}
if (filter.ebitdaRange) {
const [start, end] = filter.ebitdaRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
ebitda: {
...where.ebitda,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
ebitda: {
...where.ebitda,
[Op.lte]: end,
},
};
}
}
if (filter.free_cash_flowRange) {
const [start, end] = filter.free_cash_flowRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
free_cash_flow: {
...where.free_cash_flow,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
free_cash_flow: {
...where.free_cash_flow,
[Op.lte]: end,
},
};
}
}
if (filter.debt_to_equityRange) {
const [start, end] = filter.debt_to_equityRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
debt_to_equity: {
...where.debt_to_equity,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
debt_to_equity: {
...where.debt_to_equity,
[Op.lte]: end,
},
};
}
}
if (filter.operating_marginRange) {
const [start, end] = filter.operating_marginRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
operating_margin: {
...where.operating_margin,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
operating_margin: {
...where.operating_margin,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.reporting_period) {
where = {
...where,
reporting_period: filter.reporting_period,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.mining_fundamentals.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'mining_fundamentals',
'reporting_period',
query,
),
],
};
}
const records = await db.mining_fundamentals.findAll({
attributes: [ 'id', 'reporting_period' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['reporting_period', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.reporting_period,
}));
}
};
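The range filters in `findAll` above all follow the same merge pattern: each bound is spread into the existing per-field condition so that `Op.gte` and `Op.lte` can coexist. A minimal stand-alone sketch of that pattern, using plain `Symbol` stand-ins for Sequelize's operators (illustrative only, not the real `Op` object):

```javascript
// Hypothetical stand-ins for Sequelize's Op.gte / Op.lte, for illustration.
const Op = { gte: Symbol('gte'), lte: Symbol('lte') };

// Merges a [start, end] range into `where[field]` without clobbering
// a bound that was already set, mirroring the generated filter blocks.
function applyRange(where, field, [start, end] = []) {
  const isSet = (v) => v !== undefined && v !== null && v !== '';
  if (isSet(start)) {
    where = { ...where, [field]: { ...where[field], [Op.gte]: start } };
  }
  if (isSet(end)) {
    where = { ...where, [field]: { ...where[field], [Op.lte]: end } };
  }
  return where;
}

let where = {};
where = applyRange(where, 'revenue', [100, 500]);
where = applyRange(where, 'ebitda', [null, 50]);
// where.revenue now carries both bounds; where.ebitda only the upper one.
```

Extracting a helper like this would also collapse the ten near-identical range blocks in the method above into one loop over field names.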

View File

@ -0,0 +1,566 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Model_runsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const model_runs = await db.model_runs.create(
{
id: data.id || undefined,
run_type: data.run_type || null,
run_status: data.run_status || null,
started_at: data.started_at || null,
ended_at: data.ended_at || null,
data_window: data.data_window || null,
metrics_summary: data.metrics_summary || null,
error_details: data.error_details || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await model_runs.setModel( data.model || null, {
transaction,
});
return model_runs;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const model_runsData = data.map((item, index) => ({
id: item.id || undefined,
run_type: item.run_type || null,
run_status: item.run_status || null,
started_at: item.started_at || null,
ended_at: item.ended_at || null,
data_window: item.data_window || null,
metrics_summary: item.metrics_summary || null,
error_details: item.error_details || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const model_runs = await db.model_runs.bulkCreate(model_runsData, { transaction });
// For each item created, replace relation files
return model_runs;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const model_runs = await db.model_runs.findByPk(id, { transaction });
const updatePayload = {};
if (data.run_type !== undefined) updatePayload.run_type = data.run_type;
if (data.run_status !== undefined) updatePayload.run_status = data.run_status;
if (data.started_at !== undefined) updatePayload.started_at = data.started_at;
if (data.ended_at !== undefined) updatePayload.ended_at = data.ended_at;
if (data.data_window !== undefined) updatePayload.data_window = data.data_window;
if (data.metrics_summary !== undefined) updatePayload.metrics_summary = data.metrics_summary;
if (data.error_details !== undefined) updatePayload.error_details = data.error_details;
updatePayload.updatedById = currentUser.id;
await model_runs.update(updatePayload, {transaction});
if (data.model !== undefined) {
await model_runs.setModel(
data.model,
{ transaction }
);
}
return model_runs;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const model_runs = await db.model_runs.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of model_runs) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of model_runs) {
await record.destroy({transaction});
}
});
return model_runs;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const model_runs = await db.model_runs.findByPk(id, options);
await model_runs.update({
deletedBy: currentUser.id
}, {
transaction,
});
await model_runs.destroy({
transaction
});
return model_runs;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const model_runs = await db.model_runs.findOne({
where,
transaction,
});
if (!model_runs) {
return model_runs;
}
const output = model_runs.get({plain: true});
output.model = await model_runs.getModel({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
{
model: db.models,
as: 'model',
where: filter.model ? {
[Op.or]: [
{ id: { [Op.in]: filter.model.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.model.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.data_window) {
where = {
...where,
[Op.and]: Utils.ilike(
'model_runs',
'data_window',
filter.data_window,
),
};
}
if (filter.metrics_summary) {
where = {
...where,
[Op.and]: Utils.ilike(
'model_runs',
'metrics_summary',
filter.metrics_summary,
),
};
}
if (filter.error_details) {
where = {
...where,
[Op.and]: Utils.ilike(
'model_runs',
'error_details',
filter.error_details,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
[Op.or]: [
{
started_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
{
ended_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
],
};
}
if (filter.started_atRange) {
const [start, end] = filter.started_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
started_at: {
...where.started_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
started_at: {
...where.started_at,
[Op.lte]: end,
},
};
}
}
if (filter.ended_atRange) {
const [start, end] = filter.ended_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
ended_at: {
...where.ended_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
ended_at: {
...where.ended_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.run_type) {
where = {
...where,
run_type: filter.run_type,
};
}
if (filter.run_status) {
where = {
...where,
run_status: filter.run_status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.model_runs.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'model_runs',
'run_type',
query,
),
],
};
}
const records = await db.model_runs.findAll({
attributes: [ 'id', 'run_type' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['run_type', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.run_type,
}));
}
};
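Every `findAll` above derives its paging from `filter.page` and `filter.limit` with `offset = currentPage * limit`. A sketch of that arithmetic as a small pure function, with a guard for a missing `page` (the generated code computes `NaN * limit` when `page` is absent, so the guard here is an assumption, not the original behavior):

```javascript
// Paging arithmetic mirroring the findAll methods above.
// A limit/offset of 0 is passed through as undefined, i.e. "unbounded".
function pageToQuery(filter) {
  const limit = Number(filter.limit) || 0;
  const page = Number(filter.page) || 0; // guard: missing page -> page 0
  const offset = page * limit;
  return {
    limit: limit || undefined,
    offset: offset || undefined,
  };
}
```

For example, `pageToQuery({ page: '2', limit: 10 })` yields `{ limit: 10, offset: 20 }`, so page numbering is zero-based.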

View File

@ -0,0 +1,641 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class ModelsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const models = await db.models.create(
{
id: data.id || undefined,
name: data.name || null,
model_family: data.model_family || null,
training_mode: data.training_mode || null,
status: data.status || null,
objective: data.objective || null,
target_metric_value: data.target_metric_value || null,
last_trained_at: data.last_trained_at || null,
artifact_uri: data.artifact_uri || null,
config_snapshot: data.config_snapshot || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await models.setFeature_set( data.feature_set || null, {
transaction,
});
await models.setOwner( data.owner || null, {
transaction,
});
return models;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const modelsData = data.map((item, index) => ({
id: item.id || undefined,
name: item.name || null,
model_family: item.model_family || null,
training_mode: item.training_mode || null,
status: item.status || null,
objective: item.objective || null,
target_metric_value: item.target_metric_value || null,
last_trained_at: item.last_trained_at || null,
artifact_uri: item.artifact_uri || null,
config_snapshot: item.config_snapshot || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const models = await db.models.bulkCreate(modelsData, { transaction });
// For each item created, replace relation files
return models;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const models = await db.models.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
if (data.model_family !== undefined) updatePayload.model_family = data.model_family;
if (data.training_mode !== undefined) updatePayload.training_mode = data.training_mode;
if (data.status !== undefined) updatePayload.status = data.status;
if (data.objective !== undefined) updatePayload.objective = data.objective;
if (data.target_metric_value !== undefined) updatePayload.target_metric_value = data.target_metric_value;
if (data.last_trained_at !== undefined) updatePayload.last_trained_at = data.last_trained_at;
if (data.artifact_uri !== undefined) updatePayload.artifact_uri = data.artifact_uri;
if (data.config_snapshot !== undefined) updatePayload.config_snapshot = data.config_snapshot;
updatePayload.updatedById = currentUser.id;
await models.update(updatePayload, {transaction});
if (data.feature_set !== undefined) {
await models.setFeature_set(
data.feature_set,
{ transaction }
);
}
if (data.owner !== undefined) {
await models.setOwner(
data.owner,
{ transaction }
);
}
return models;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const models = await db.models.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of models) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of models) {
await record.destroy({transaction});
}
});
return models;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const models = await db.models.findByPk(id, options);
await models.update({
deletedBy: currentUser.id
}, {
transaction,
});
await models.destroy({
transaction
});
return models;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const models = await db.models.findOne({
where,
transaction,
});
if (!models) {
return models;
}
const output = models.get({plain: true});
output.model_runs_model = await models.getModel_runs_model({
transaction
});
output.forecasts_model = await models.getForecasts_model({
transaction
});
output.alerts_target_model = await models.getAlerts_target_model({
transaction
});
output.feature_set = await models.getFeature_set({
transaction
});
output.owner = await models.getOwner({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
{
model: db.feature_sets,
as: 'feature_set',
where: filter.feature_set ? {
[Op.or]: [
{ id: { [Op.in]: filter.feature_set.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.feature_set.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.users,
as: 'owner',
where: filter.owner ? {
[Op.or]: [
{ id: { [Op.in]: filter.owner.split('|').map(term => Utils.uuid(term)) } },
{
firstName: {
[Op.or]: filter.owner.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'models',
'name',
filter.name,
),
};
}
if (filter.objective) {
where = {
...where,
[Op.and]: Utils.ilike(
'models',
'objective',
filter.objective,
),
};
}
if (filter.artifact_uri) {
where = {
...where,
[Op.and]: Utils.ilike(
'models',
'artifact_uri',
filter.artifact_uri,
),
};
}
if (filter.config_snapshot) {
where = {
...where,
[Op.and]: Utils.ilike(
'models',
'config_snapshot',
filter.config_snapshot,
),
};
}
if (filter.target_metric_valueRange) {
const [start, end] = filter.target_metric_valueRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
target_metric_value: {
...where.target_metric_value,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
target_metric_value: {
...where.target_metric_value,
[Op.lte]: end,
},
};
}
}
if (filter.last_trained_atRange) {
const [start, end] = filter.last_trained_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
last_trained_at: {
...where.last_trained_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
last_trained_at: {
...where.last_trained_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.model_family) {
where = {
...where,
model_family: filter.model_family,
};
}
if (filter.training_mode) {
where = {
...where,
training_mode: filter.training_mode,
};
}
if (filter.status) {
where = {
...where,
status: filter.status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.models.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'models',
'name',
query,
),
],
};
}
const records = await db.models.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
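The association filters in the `include` arrays above split a `'a|b'` string into terms and match each one case-insensitively. A sketch of that expansion with `Symbol` stand-ins for Sequelize's operators (the `Op` object and the `buildNameFilter` helper are illustrative, not part of the generated code):

```javascript
// Hypothetical stand-ins for Sequelize's Op.or / Op.iLike.
const Op = { or: Symbol('or'), iLike: Symbol('iLike') };

// Expands a pipe-delimited filter string into the Op.or-of-iLike shape
// fed to the association where clauses above.
function buildNameFilter(filterValue) {
  const terms = filterValue.split('|');
  return {
    name: {
      [Op.or]: terms.map((term) => ({ [Op.iLike]: `%${term}%` })),
    },
  };
}

const f = buildNameFilter('alpha|beta');
// f.name[Op.or] holds one %term% pattern per pipe-delimited term.
```

The generated code additionally ORs these name patterns with an `id IN (...)` branch so the same filter string can carry either UUIDs or substrings.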

View File

@ -0,0 +1,349 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class PermissionsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const permissions = await db.permissions.create(
{
id: data.id || undefined,
name: data.name || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
return permissions;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const permissionsData = data.map((item, index) => ({
id: item.id || undefined,
name: item.name || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const permissions = await db.permissions.bulkCreate(permissionsData, { transaction });
// For each item created, replace relation files
return permissions;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const permissions = await db.permissions.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
updatePayload.updatedById = currentUser.id;
await permissions.update(updatePayload, {transaction});
return permissions;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const permissions = await db.permissions.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of permissions) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of permissions) {
await record.destroy({transaction});
}
});
return permissions;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const permissions = await db.permissions.findByPk(id, options);
await permissions.update({
deletedBy: currentUser.id
}, {
transaction,
});
await permissions.destroy({
transaction
});
return permissions;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const permissions = await db.permissions.findOne({
where,
transaction,
});
if (!permissions) {
return permissions;
}
const output = permissions.get({plain: true});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'permissions',
'name',
filter.name,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.permissions.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'permissions',
'name',
query,
),
],
};
}
const records = await db.permissions.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
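Each `deleteByIds` above performs a two-pass delete: first stamp `deletedBy` on every record, then destroy them, so soft-deleted (paranoid) rows record who removed them. An in-memory sketch of that pattern; the record objects are hypothetical stand-ins for Sequelize instances, not the real model class:

```javascript
// Two-pass soft delete, mirroring deleteByIds: stamp deletedBy on every
// record before any destroy runs, then destroy them all.
async function softDeleteAll(records, userId) {
  for (const record of records) {
    await record.update({ deletedBy: userId });
  }
  for (const record of records) {
    await record.destroy();
  }
  return records;
}

// Minimal fake records for demonstration only.
const makeRecord = (id) => ({
  id,
  deletedBy: null,
  destroyed: false,
  async update(patch) { Object.assign(this, patch); },
  async destroy() { this.destroyed = true; },
});
```

In the real methods both passes run inside one `db.sequelize.transaction`, so a failed destroy rolls back the `deletedBy` stamps as well.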

419
backend/src/db/api/roles.js Normal file
View File

@ -0,0 +1,419 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class RolesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const roles = await db.roles.create(
{
id: data.id || undefined,
name: data.name || null,
role_customization: data.role_customization || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await roles.setPermissions(data.permissions || [], {
transaction,
});
return roles;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const rolesData = data.map((item, index) => ({
id: item.id || undefined,
name: item.name || null,
role_customization: item.role_customization || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const roles = await db.roles.bulkCreate(rolesData, { transaction });
// For each item created, replace relation files
return roles;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const roles = await db.roles.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
if (data.role_customization !== undefined) updatePayload.role_customization = data.role_customization;
updatePayload.updatedById = currentUser.id;
await roles.update(updatePayload, {transaction});
if (data.permissions !== undefined) {
await roles.setPermissions(data.permissions, { transaction });
}
return roles;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const roles = await db.roles.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of roles) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of roles) {
await record.destroy({transaction});
}
});
return roles;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const roles = await db.roles.findByPk(id, options);
await roles.update({
deletedBy: currentUser.id
}, {
transaction,
});
await roles.destroy({
transaction
});
return roles;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const roles = await db.roles.findOne({
where,
transaction,
});
if (!roles) {
return roles;
}
const output = roles.get({plain: true});
output.users_app_role = await roles.getUsers_app_role({
transaction
});
output.permissions = await roles.getPermissions({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
{
model: db.permissions,
as: 'permissions',
required: false,
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'roles',
'name',
filter.name,
),
};
}
if (filter.role_customization) {
where = {
...where,
[Op.and]: Utils.ilike(
'roles',
'role_customization',
filter.role_customization,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.permissions) {
const searchTerms = filter.permissions.split('|');
include = [
{
model: db.permissions,
as: 'permissions_filter',
required: searchTerms.length > 0,
where: searchTerms.length > 0 ? {
[Op.or]: [
{ id: { [Op.in]: searchTerms.map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: searchTerms.map(term => ({ [Op.iLike]: `%${term}%` }))
}
}
]
} : undefined
},
...include,
]
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.roles.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'roles',
'name',
query,
),
],
};
}
const records = await db.roles.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
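The pagination in `findAll` above boils down to a small mapping from `filter` to Sequelize's `limit`/`offset`, where a `limit` of 0 and a resulting offset of 0 both fall through to `undefined` (no constraint). A minimal sketch of that mapping as a hypothetical standalone helper (the `|| 0` page fallback is an assumption; the original uses a bare `+filter.page`):

```javascript
// Hypothetical helper mirroring the findAll pagination logic (not part of the API above).
function pageToQuery(filter) {
  const limit = filter.limit || 0;            // 0 means "no limit"
  const offset = (+filter.page || 0) * limit; // original code has no || 0 fallback
  return {
    limit: limit ? Number(limit) : undefined,
    offset: offset ? Number(offset) : undefined,
  };
}
```

Note that because the guard is truthiness-based, page 0 always produces an `undefined` offset, which Sequelize treats as "start from the first row".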

// ===== New file: Scenario_resultsDBApi (571 lines) =====
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Scenario_resultsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const scenario_results = await db.scenario_results.create(
{
id: data.id || undefined,
delta_point_estimate: data.delta_point_estimate || null,
delta_p50: data.delta_p50 || null,
delta_volatility: data.delta_volatility || null,
pnl_proxy: data.pnl_proxy || null,
narrative: data.narrative || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await scenario_results.setScenario( data.scenario || null, {
transaction,
});
await scenario_results.setForecast( data.forecast || null, {
transaction,
});
return scenario_results;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const scenario_resultsData = data.map((item, index) => ({
id: item.id || undefined,
delta_point_estimate: item.delta_point_estimate || null,
delta_p50: item.delta_p50 || null,
delta_volatility: item.delta_volatility || null,
pnl_proxy: item.pnl_proxy || null,
narrative: item.narrative || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const scenario_results = await db.scenario_results.bulkCreate(scenario_resultsData, { transaction });
// For each item created, replace relation files
return scenario_results;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const scenario_results = await db.scenario_results.findByPk(id, { transaction });
const updatePayload = {};
if (data.delta_point_estimate !== undefined) updatePayload.delta_point_estimate = data.delta_point_estimate;
if (data.delta_p50 !== undefined) updatePayload.delta_p50 = data.delta_p50;
if (data.delta_volatility !== undefined) updatePayload.delta_volatility = data.delta_volatility;
if (data.pnl_proxy !== undefined) updatePayload.pnl_proxy = data.pnl_proxy;
if (data.narrative !== undefined) updatePayload.narrative = data.narrative;
updatePayload.updatedById = currentUser.id;
await scenario_results.update(updatePayload, {transaction});
if (data.scenario !== undefined) {
await scenario_results.setScenario(
data.scenario,
{ transaction }
);
}
if (data.forecast !== undefined) {
await scenario_results.setForecast(
data.forecast,
{ transaction }
);
}
return scenario_results;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const scenario_results = await db.scenario_results.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of scenario_results) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of scenario_results) {
await record.destroy({transaction});
}
});
return scenario_results;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const scenario_results = await db.scenario_results.findByPk(id, options);
await scenario_results.update({
deletedBy: currentUser.id
}, {
transaction,
});
await scenario_results.destroy({
transaction
});
return scenario_results;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const scenario_results = await db.scenario_results.findOne({ where, transaction });
if (!scenario_results) {
return scenario_results;
}
const output = scenario_results.get({plain: true});
output.scenario = await scenario_results.getScenario({
transaction
});
output.forecast = await scenario_results.getForecast({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const orderBy = null;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.scenarios,
as: 'scenario',
where: filter.scenario ? {
[Op.or]: [
{ id: { [Op.in]: filter.scenario.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.scenario.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.forecasts,
as: 'forecast',
where: filter.forecast ? {
[Op.or]: [
{ id: { [Op.in]: filter.forecast.split('|').map(term => Utils.uuid(term)) } },
{
horizon: {
[Op.or]: filter.forecast.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.narrative) {
where = {
...where,
[Op.and]: Utils.ilike(
'scenario_results',
'narrative',
filter.narrative,
),
};
}
if (filter.delta_point_estimateRange) {
const [start, end] = filter.delta_point_estimateRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
delta_point_estimate: {
...where.delta_point_estimate,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
delta_point_estimate: {
...where.delta_point_estimate,
[Op.lte]: end,
},
};
}
}
if (filter.delta_p50Range) {
const [start, end] = filter.delta_p50Range;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
delta_p50: {
...where.delta_p50,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
delta_p50: {
...where.delta_p50,
[Op.lte]: end,
},
};
}
}
if (filter.delta_volatilityRange) {
const [start, end] = filter.delta_volatilityRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
delta_volatility: {
...where.delta_volatility,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
delta_volatility: {
...where.delta_volatility,
[Op.lte]: end,
},
};
}
}
if (filter.pnl_proxyRange) {
const [start, end] = filter.pnl_proxyRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
pnl_proxy: {
...where.pnl_proxy,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
pnl_proxy: {
...where.pnl_proxy,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.scenario_results.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'scenario_results',
'pnl_proxy',
query,
),
],
};
}
const records = await db.scenario_results.findAll({
attributes: [ 'id', 'pnl_proxy' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['pnl_proxy', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.pnl_proxy,
}));
}
};
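The relation filters above (`filter.scenario`, `filter.forecast`) share one pattern: a pipe-separated string becomes an `Op.or` of UUID matches and case-insensitive text matches. A plain-object sketch of that shape, with string keys standing in for Sequelize's `Op` symbols and the `Utils.uuid` wrapping omitted:

```javascript
// Hypothetical plain-object version of the pipe-separated relation filter.
function buildRelationFilter(raw, textField) {
  const terms = raw.split('|').filter(Boolean);
  return {
    or: [
      { id: { in: terms } }, // the real code wraps each term with Utils.uuid()
      { [textField]: { or: terms.map((t) => ({ iLike: `%${t}%` })) } },
    ],
  };
}
```

In the real include, this object is only attached when the filter string is present; otherwise the join condition is `{}` so the association is joined unfiltered.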

// ===== New file: Scenario_shocksDBApi (572 lines) =====
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Scenario_shocksDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const scenario_shocks = await db.scenario_shocks.create(
{
id: data.id || undefined,
shock_style: data.shock_style || null,
shock_value: data.shock_value || null,
shock_start_at: data.shock_start_at || null,
shock_end_at: data.shock_end_at || null,
shock_path_note: data.shock_path_note || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await scenario_shocks.setScenario( data.scenario || null, {
transaction,
});
await scenario_shocks.setAsset( data.asset || null, {
transaction,
});
return scenario_shocks;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const scenario_shocksData = data.map((item, index) => ({
id: item.id || undefined,
shock_style: item.shock_style || null,
shock_value: item.shock_value || null,
shock_start_at: item.shock_start_at || null,
shock_end_at: item.shock_end_at || null,
shock_path_note: item.shock_path_note || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const scenario_shocks = await db.scenario_shocks.bulkCreate(scenario_shocksData, { transaction });
// For each item created, replace relation files
return scenario_shocks;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const scenario_shocks = await db.scenario_shocks.findByPk(id, { transaction });
const updatePayload = {};
if (data.shock_style !== undefined) updatePayload.shock_style = data.shock_style;
if (data.shock_value !== undefined) updatePayload.shock_value = data.shock_value;
if (data.shock_start_at !== undefined) updatePayload.shock_start_at = data.shock_start_at;
if (data.shock_end_at !== undefined) updatePayload.shock_end_at = data.shock_end_at;
if (data.shock_path_note !== undefined) updatePayload.shock_path_note = data.shock_path_note;
updatePayload.updatedById = currentUser.id;
await scenario_shocks.update(updatePayload, {transaction});
if (data.scenario !== undefined) {
await scenario_shocks.setScenario(
data.scenario,
{ transaction }
);
}
if (data.asset !== undefined) {
await scenario_shocks.setAsset(
data.asset,
{ transaction }
);
}
return scenario_shocks;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const scenario_shocks = await db.scenario_shocks.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of scenario_shocks) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of scenario_shocks) {
await record.destroy({transaction});
}
});
return scenario_shocks;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const scenario_shocks = await db.scenario_shocks.findByPk(id, options);
await scenario_shocks.update({
deletedBy: currentUser.id
}, {
transaction,
});
await scenario_shocks.destroy({
transaction
});
return scenario_shocks;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const scenario_shocks = await db.scenario_shocks.findOne({ where, transaction });
if (!scenario_shocks) {
return scenario_shocks;
}
const output = scenario_shocks.get({plain: true});
output.scenario = await scenario_shocks.getScenario({
transaction
});
output.asset = await scenario_shocks.getAsset({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const orderBy = null;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.scenarios,
as: 'scenario',
where: filter.scenario ? {
[Op.or]: [
{ id: { [Op.in]: filter.scenario.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.scenario.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.assets,
as: 'asset',
where: filter.asset ? {
[Op.or]: [
{ id: { [Op.in]: filter.asset.split('|').map(term => Utils.uuid(term)) } },
{
symbol: {
[Op.or]: filter.asset.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.shock_path_note) {
where = {
...where,
[Op.and]: Utils.ilike(
'scenario_shocks',
'shock_path_note',
filter.shock_path_note,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
[Op.or]: [
{
shock_start_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
{
shock_end_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
],
};
}
if (filter.shock_valueRange) {
const [start, end] = filter.shock_valueRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
shock_value: {
...where.shock_value,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
shock_value: {
...where.shock_value,
[Op.lte]: end,
},
};
}
}
if (filter.shock_start_atRange) {
const [start, end] = filter.shock_start_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
shock_start_at: {
...where.shock_start_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
shock_start_at: {
...where.shock_start_at,
[Op.lte]: end,
},
};
}
}
if (filter.shock_end_atRange) {
const [start, end] = filter.shock_end_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
shock_end_at: {
...where.shock_end_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
shock_end_at: {
...where.shock_end_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.shock_style) {
where = {
...where,
shock_style: filter.shock_style,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.scenario_shocks.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'scenario_shocks',
'shock_style',
query,
),
],
};
}
const records = await db.scenario_shocks.findAll({
attributes: [ 'id', 'shock_style' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['shock_style', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.shock_style,
}));
}
};
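The `calendarStart`/`calendarEnd` filter above matches a shock when either its start or its end timestamp falls inside the requested window (`Op.between` on each column, combined with `Op.or`). The equivalent predicate in plain JavaScript, a sketch that relies on ISO-8601 strings comparing correctly lexicographically:

```javascript
// Plain predicate mirroring the Op.or / Op.between calendar filter (sketch only).
function inCalendarWindow(shock, start, end) {
  const between = (t) => t !== null && t >= start && t <= end;
  return between(shock.shock_start_at) || between(shock.shock_end_at);
}
```

A shock that fully spans the window (starts before it and ends after it) is not matched by this predicate, and the same is true of the SQL version above.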

// ===== New file: ScenariosDBApi (569 lines) =====
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class ScenariosDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const scenarios = await db.scenarios.create(
{
id: data.id || undefined,
name: data.name || null,
scenario_type: data.scenario_type || null,
status: data.status || null,
description: data.description || null,
valid_from_at: data.valid_from_at || null,
valid_to_at: data.valid_to_at || null,
probability_weight: data.probability_weight || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await scenarios.setOwner( data.owner || null, {
transaction,
});
return scenarios;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const scenariosData = data.map((item, index) => ({
id: item.id || undefined,
name: item.name || null,
scenario_type: item.scenario_type || null,
status: item.status || null,
description: item.description || null,
valid_from_at: item.valid_from_at || null,
valid_to_at: item.valid_to_at || null,
probability_weight: item.probability_weight || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const scenarios = await db.scenarios.bulkCreate(scenariosData, { transaction });
// For each item created, replace relation files
return scenarios;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const scenarios = await db.scenarios.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
if (data.scenario_type !== undefined) updatePayload.scenario_type = data.scenario_type;
if (data.status !== undefined) updatePayload.status = data.status;
if (data.description !== undefined) updatePayload.description = data.description;
if (data.valid_from_at !== undefined) updatePayload.valid_from_at = data.valid_from_at;
if (data.valid_to_at !== undefined) updatePayload.valid_to_at = data.valid_to_at;
if (data.probability_weight !== undefined) updatePayload.probability_weight = data.probability_weight;
updatePayload.updatedById = currentUser.id;
await scenarios.update(updatePayload, {transaction});
if (data.owner !== undefined) {
await scenarios.setOwner(
data.owner,
{ transaction }
);
}
return scenarios;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const scenarios = await db.scenarios.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of scenarios) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of scenarios) {
await record.destroy({transaction});
}
});
return scenarios;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const scenarios = await db.scenarios.findByPk(id, options);
await scenarios.update({
deletedBy: currentUser.id
}, {
transaction,
});
await scenarios.destroy({
transaction
});
return scenarios;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const scenarios = await db.scenarios.findOne({ where, transaction });
if (!scenarios) {
return scenarios;
}
const output = scenarios.get({plain: true});
output.scenario_shocks_scenario = await scenarios.getScenario_shocks_scenario({
transaction
});
output.scenario_results_scenario = await scenarios.getScenario_results_scenario({
transaction
});
output.owner = await scenarios.getOwner({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const orderBy = null;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.users,
as: 'owner',
where: filter.owner ? {
[Op.or]: [
{ id: { [Op.in]: filter.owner.split('|').map(term => Utils.uuid(term)) } },
{
firstName: {
[Op.or]: filter.owner.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'scenarios',
'name',
filter.name,
),
};
}
if (filter.description) {
where = {
...where,
[Op.and]: Utils.ilike(
'scenarios',
'description',
filter.description,
),
};
}
if (filter.valid_from_atRange) {
const [start, end] = filter.valid_from_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
valid_from_at: {
...where.valid_from_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
valid_from_at: {
...where.valid_from_at,
[Op.lte]: end,
},
};
}
}
if (filter.valid_to_atRange) {
const [start, end] = filter.valid_to_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
valid_to_at: {
...where.valid_to_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
valid_to_at: {
...where.valid_to_at,
[Op.lte]: end,
},
};
}
}
if (filter.probability_weightRange) {
const [start, end] = filter.probability_weightRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
probability_weight: {
...where.probability_weight,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
probability_weight: {
...where.probability_weight,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.scenario_type) {
where = {
...where,
scenario_type: filter.scenario_type,
};
}
if (filter.status) {
where = {
...where,
status: filter.status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.scenarios.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'scenarios',
'name',
query,
),
],
};
}
const records = await db.scenarios.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
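Every `*Range` filter in `findAll` above follows the same merge pattern: apply a lower bound (`Op.gte`) and/or an upper bound (`Op.lte`) only when that bound is present, merging both onto the same column key so the two conditions combine. A condensed sketch of the pattern, with string keys in place of the `Op` symbols:

```javascript
// Hypothetical condensed form of the [start, end] range-filter pattern.
function applyRange(where, key, [start, end] = []) {
  const present = (v) => v !== undefined && v !== null && v !== '';
  let clause = { ...where[key] }; // preserve a bound merged in earlier
  if (present(start)) clause = { ...clause, gte: start };
  if (present(end)) clause = { ...clause, lte: end };
  return Object.keys(clause).length ? { ...where, [key]: clause } : where;
}
```

The spread of the existing `where[key]` is what lets the two bounds accumulate into a single `{ gte, lte }` clause instead of the second bound overwriting the first.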

// ===== New file: Time_seriesDBApi (683 lines) =====
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Time_seriesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const time_series = await db.time_series.create(
{
id: data.id || undefined,
timestamp: data.timestamp || null,
bar_size: data.bar_size || null,
open: data.open || null,
high: data.high || null,
low: data.low || null,
close: data.close || null,
volume: data.volume || null,
vwap: data.vwap || null,
is_imputed: data.is_imputed || false,
quality_flag: data.quality_flag || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await time_series.setAsset( data.asset || null, {
transaction,
});
return time_series;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const time_seriesData = data.map((item, index) => ({
id: item.id || undefined,
timestamp: item.timestamp || null,
bar_size: item.bar_size || null,
open: item.open || null,
high: item.high || null,
low: item.low || null,
close: item.close || null,
volume: item.volume || null,
vwap: item.vwap || null,
is_imputed: item.is_imputed || false,
quality_flag: item.quality_flag || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const time_series = await db.time_series.bulkCreate(time_seriesData, { transaction });
// For each item created, replace relation files
return time_series;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const time_series = await db.time_series.findByPk(id, { transaction });
const updatePayload = {};
if (data.timestamp !== undefined) updatePayload.timestamp = data.timestamp;
if (data.bar_size !== undefined) updatePayload.bar_size = data.bar_size;
if (data.open !== undefined) updatePayload.open = data.open;
if (data.high !== undefined) updatePayload.high = data.high;
if (data.low !== undefined) updatePayload.low = data.low;
if (data.close !== undefined) updatePayload.close = data.close;
if (data.volume !== undefined) updatePayload.volume = data.volume;
if (data.vwap !== undefined) updatePayload.vwap = data.vwap;
if (data.is_imputed !== undefined) updatePayload.is_imputed = data.is_imputed;
if (data.quality_flag !== undefined) updatePayload.quality_flag = data.quality_flag;
updatePayload.updatedById = currentUser.id;
await time_series.update(updatePayload, {transaction});
if (data.asset !== undefined) {
await time_series.setAsset(
data.asset,
{ transaction }
);
}
return time_series;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const time_series = await db.time_series.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of time_series) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of time_series) {
await record.destroy({transaction});
}
});
return time_series;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const time_series = await db.time_series.findByPk(id, options);
await time_series.update({
deletedBy: currentUser.id
}, {
transaction,
});
await time_series.destroy({
transaction
});
return time_series;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const time_series = await db.time_series.findOne({ where, transaction });
if (!time_series) {
return time_series;
}
const output = time_series.get({plain: true});
output.asset = await time_series.getAsset({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const orderBy = null;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.assets,
as: 'asset',
where: filter.asset ? {
[Op.or]: [
{ id: { [Op.in]: filter.asset.split('|').map(term => Utils.uuid(term)) } },
{
symbol: {
[Op.or]: filter.asset.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.timestampRange) {
const [start, end] = filter.timestampRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
timestamp: {
...where.timestamp,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
timestamp: {
...where.timestamp,
[Op.lte]: end,
},
};
}
}
if (filter.openRange) {
const [start, end] = filter.openRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
open: {
...where.open,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
open: {
...where.open,
[Op.lte]: end,
},
};
}
}
if (filter.highRange) {
const [start, end] = filter.highRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
high: {
...where.high,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
high: {
...where.high,
[Op.lte]: end,
},
};
}
}
if (filter.lowRange) {
const [start, end] = filter.lowRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
low: {
...where.low,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
low: {
...where.low,
[Op.lte]: end,
},
};
}
}
if (filter.closeRange) {
const [start, end] = filter.closeRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
close: {
...where.close,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
close: {
...where.close,
[Op.lte]: end,
},
};
}
}
if (filter.volumeRange) {
const [start, end] = filter.volumeRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
volume: {
...where.volume,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
volume: {
...where.volume,
[Op.lte]: end,
},
};
}
}
if (filter.vwapRange) {
const [start, end] = filter.vwapRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
vwap: {
...where.vwap,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
vwap: {
...where.vwap,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.bar_size) {
where = {
...where,
bar_size: filter.bar_size,
};
}
if (filter.is_imputed) {
where = {
...where,
is_imputed: filter.is_imputed,
};
}
if (filter.quality_flag) {
where = {
...where,
quality_flag: filter.quality_flag,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.time_series.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'time_series',
'bar_size',
query,
),
],
};
}
const records = await db.time_series.findAll({
attributes: [ 'id', 'bar_size' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['bar_size', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.bar_size,
}));
}
};

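Every numeric and date filter in `findAll` above repeats the same merge: a `<field>Range` tuple contributes an `Op.gte` and/or `Op.lte` bound to the `where` clause. A minimal self-contained sketch of that pattern — plain string keys stand in for Sequelize's `Op` symbols, and `applyRange` is an illustrative helper, not part of this codebase:

```javascript
// Sketch of the range-filter pattern repeated throughout findAll:
// each "<field>Range" filter merges lower/upper bounds into the where clause.
function applyRange(where, field, range) {
  const [start, end] = range || [];
  if (start !== undefined && start !== null && start !== '') {
    where = { ...where, [field]: { ...where[field], gte: start } };
  }
  if (end !== undefined && end !== null && end !== '') {
    where = { ...where, [field]: { ...where[field], lte: end } };
  }
  return where;
}

let where = {};
where = applyRange(where, 'open', [100, 200]);
where = applyRange(where, 'close', [null, 150]); // missing bound is skipped
console.log(where);
// { open: { gte: 100, lte: 200 }, close: { lte: 150 } }
```

Routing the seven near-identical range blocks through a helper like this would shrink `findAll` considerably without changing behavior.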
backend/src/db/api/users.js

@ -0,0 +1,971 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const bcrypt = require('bcrypt');
const config = require('../../config');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class UsersDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
    const users = await db.users.create(
      {
        id: data.data.id || undefined,
        firstName: data.data.firstName || null,
        lastName: data.data.lastName || null,
        phoneNumber: data.data.phoneNumber || null,
        email: data.data.email || null,
        disabled: data.data.disabled || false,
        password: data.data.password || null,
        emailVerified: data.data.emailVerified || true, // note: `|| true` makes this always true
        emailVerificationToken: data.data.emailVerificationToken || null,
        emailVerificationTokenExpiresAt: data.data.emailVerificationTokenExpiresAt || null,
        passwordResetToken: data.data.passwordResetToken || null,
        passwordResetTokenExpiresAt: data.data.passwordResetTokenExpiresAt || null,
        provider: data.data.provider || null,
        importHash: data.data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );
if (!data.data.app_role) {
const role = await db.roles.findOne({
where: { name: 'User' },
});
if (role) {
await users.setApp_role(role, {
transaction,
});
}
    } else {
await users.setApp_role(data.data.app_role || null, {
transaction,
});
}
await users.setCustom_permissions(data.data.custom_permissions || [], {
transaction,
});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.users.getTableName(),
belongsToColumn: 'avatar',
belongsToId: users.id,
},
data.data.avatar,
options,
);
return users;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
    const usersData = data.map((item, index) => ({
      id: item.id || undefined,
      firstName: item.firstName || null,
      lastName: item.lastName || null,
      phoneNumber: item.phoneNumber || null,
      email: item.email || null,
      disabled: item.disabled || false,
      password: item.password || null,
      emailVerified: item.emailVerified || false,
      emailVerificationToken: item.emailVerificationToken || null,
      emailVerificationTokenExpiresAt: item.emailVerificationTokenExpiresAt || null,
      passwordResetToken: item.passwordResetToken || null,
      passwordResetTokenExpiresAt: item.passwordResetTokenExpiresAt || null,
      provider: item.provider || null,
      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000), // stagger timestamps to preserve import order
    }));
// Bulk create items
const users = await db.users.bulkCreate(usersData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < users.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.users.getTableName(),
belongsToColumn: 'avatar',
belongsToId: users[i].id,
},
data[i].avatar,
options,
);
}
return users;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const users = await db.users.findByPk(id, { transaction });
if (!data?.app_role) {
data.app_role = users?.app_role?.id;
}
if (!data?.custom_permissions) {
data.custom_permissions = users?.custom_permissions?.map(item => item.id);
}
if (data.password) {
data.password = bcrypt.hashSync(
data.password,
config.bcrypt.saltRounds,
);
} else {
data.password = users.password;
}
const updatePayload = {};
if (data.firstName !== undefined) updatePayload.firstName = data.firstName;
if (data.lastName !== undefined) updatePayload.lastName = data.lastName;
if (data.phoneNumber !== undefined) updatePayload.phoneNumber = data.phoneNumber;
if (data.email !== undefined) updatePayload.email = data.email;
if (data.disabled !== undefined) updatePayload.disabled = data.disabled;
if (data.password !== undefined) updatePayload.password = data.password;
if (data.emailVerified !== undefined) updatePayload.emailVerified = data.emailVerified;
else updatePayload.emailVerified = true;
if (data.emailVerificationToken !== undefined) updatePayload.emailVerificationToken = data.emailVerificationToken;
if (data.emailVerificationTokenExpiresAt !== undefined) updatePayload.emailVerificationTokenExpiresAt = data.emailVerificationTokenExpiresAt;
if (data.passwordResetToken !== undefined) updatePayload.passwordResetToken = data.passwordResetToken;
if (data.passwordResetTokenExpiresAt !== undefined) updatePayload.passwordResetTokenExpiresAt = data.passwordResetTokenExpiresAt;
if (data.provider !== undefined) updatePayload.provider = data.provider;
updatePayload.updatedById = currentUser.id;
await users.update(updatePayload, {transaction});
if (data.app_role !== undefined) {
await users.setApp_role(
data.app_role,
{ transaction }
);
}
if (data.custom_permissions !== undefined) {
await users.setCustom_permissions(data.custom_permissions, { transaction });
}
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.users.getTableName(),
belongsToColumn: 'avatar',
belongsToId: users.id,
},
data.avatar,
options,
);
return users;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const users = await db.users.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of users) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of users) {
await record.destroy({transaction});
}
});
return users;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const users = await db.users.findByPk(id, options);
await users.update({
deletedBy: currentUser.id
}, {
transaction,
});
await users.destroy({
transaction
});
return users;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const users = await db.users.findOne(
      { where, transaction },
    );
if (!users) {
return users;
}
const output = users.get({plain: true});
output.models_owner = await users.getModels_owner({
transaction
});
output.scenarios_owner = await users.getScenarios_owner({
transaction
});
output.alerts_owner = await users.getAlerts_owner({
transaction
});
output.alert_events_acknowledged_by = await users.getAlert_events_acknowledged_by({
transaction
});
output.api_keys_owner = await users.getApi_keys_owner({
transaction
});
output.audit_events_actor = await users.getAudit_events_actor({
transaction
});
output.avatar = await users.getAvatar({
transaction
});
output.app_role = await users.getApp_role({
transaction
});
if (output.app_role) {
output.app_role_permissions = await output.app_role.getPermissions({
transaction,
});
}
output.custom_permissions = await users.getCustom_permissions({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.roles,
as: 'app_role',
where: filter.app_role ? {
[Op.or]: [
{ id: { [Op.in]: filter.app_role.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.app_role.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.permissions,
as: 'custom_permissions',
required: false,
},
{
model: db.file,
as: 'avatar',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.firstName) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'firstName',
filter.firstName,
),
};
}
if (filter.lastName) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'lastName',
filter.lastName,
),
};
}
if (filter.phoneNumber) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'phoneNumber',
filter.phoneNumber,
),
};
}
if (filter.email) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'email',
filter.email,
),
};
}
if (filter.password) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'password',
filter.password,
),
};
}
if (filter.emailVerificationToken) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'emailVerificationToken',
filter.emailVerificationToken,
),
};
}
if (filter.passwordResetToken) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'passwordResetToken',
filter.passwordResetToken,
),
};
}
if (filter.provider) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'provider',
filter.provider,
),
};
}
if (filter.emailVerificationTokenExpiresAtRange) {
const [start, end] = filter.emailVerificationTokenExpiresAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
emailVerificationTokenExpiresAt: {
...where.emailVerificationTokenExpiresAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
emailVerificationTokenExpiresAt: {
...where.emailVerificationTokenExpiresAt,
[Op.lte]: end,
},
};
}
}
if (filter.passwordResetTokenExpiresAtRange) {
const [start, end] = filter.passwordResetTokenExpiresAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
passwordResetTokenExpiresAt: {
...where.passwordResetTokenExpiresAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
passwordResetTokenExpiresAt: {
...where.passwordResetTokenExpiresAt,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.disabled) {
where = {
...where,
disabled: filter.disabled,
};
}
if (filter.emailVerified) {
where = {
...where,
emailVerified: filter.emailVerified,
};
}
if (filter.custom_permissions) {
const searchTerms = filter.custom_permissions.split('|');
include = [
{
model: db.permissions,
as: 'custom_permissions_filter',
required: searchTerms.length > 0,
where: searchTerms.length > 0 ? {
[Op.or]: [
{ id: { [Op.in]: searchTerms.map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: searchTerms.map(term => ({ [Op.iLike]: `%${term}%` }))
}
}
]
} : undefined
},
...include,
]
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.users.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'users',
'firstName',
query,
),
],
};
}
const records = await db.users.findAll({
attributes: [ 'id', 'firstName' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['firstName', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.firstName,
}));
}
static async createFromAuth(data, options) {
const transaction = (options && options.transaction) || undefined;
const users = await db.users.create(
{
email: data.email,
firstName: data.firstName,
authenticationUid: data.authenticationUid,
password: data.password,
},
{ transaction },
);
const app_role = await db.roles.findOne({
where: { name: config.roles?.user || "User" },
});
if (app_role?.id) {
await users.setApp_role(app_role?.id || null, {
transaction,
});
}
await users.update(
{
authenticationUid: users.id,
},
{ transaction },
);
delete users.password;
return users;
}
static async updatePassword(id, password, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const users = await db.users.findByPk(id, {
transaction,
});
await users.update(
{
password,
authenticationUid: id,
updatedById: currentUser.id,
},
{ transaction },
);
return users;
}
static async generateEmailVerificationToken(email, options) {
return this._generateToken(['emailVerificationToken', 'emailVerificationTokenExpiresAt'], email, options);
}
static async generatePasswordResetToken(email, options) {
return this._generateToken(['passwordResetToken', 'passwordResetTokenExpiresAt'], email, options);
}
static async findByPasswordResetToken(token, options) {
const transaction = (options && options.transaction) || undefined;
    return db.users.findOne(
      {
        where: {
          passwordResetToken: token,
          passwordResetTokenExpiresAt: {
            [db.Sequelize.Op.gt]: Date.now(),
          },
        },
        transaction,
      },
    );
}
static async findByEmailVerificationToken(
token,
options,
) {
const transaction = (options && options.transaction) || undefined;
    return db.users.findOne(
      {
        where: {
          emailVerificationToken: token,
          emailVerificationTokenExpiresAt: {
            [db.Sequelize.Op.gt]: Date.now(),
          },
        },
        transaction,
      },
    );
}
static async markEmailVerified(id, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const users = await db.users.findByPk(id, {
transaction,
});
await users.update(
{
emailVerified: true,
updatedById: currentUser.id,
},
{ transaction },
);
return true;
}
static async _generateToken(keyNames, email, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const users = await db.users.findOne(
      {
        where: { email: email.toLowerCase() },
        transaction,
      },
    );
    const token = crypto.randomBytes(20).toString('hex');
    const tokenExpiresAt = Date.now() + 360000; // 360000 ms = 6 minutes
if(users){
await users.update(
{
[keyNames[0]]: token,
[keyNames[1]]: tokenExpiresAt,
updatedById: currentUser.id,
},
{transaction},
);
}
return token;
}
};


@ -0,0 +1,33 @@
module.exports = {
production: {
dialect: 'postgres',
username: process.env.DB_USER,
password: process.env.DB_PASS,
database: process.env.DB_NAME,
host: process.env.DB_HOST,
port: process.env.DB_PORT,
logging: console.log,
seederStorage: 'sequelize',
},
development: {
username: 'postgres',
dialect: 'postgres',
password: '',
database: 'db_gold_forecasting_engine',
host: process.env.DB_HOST || 'localhost',
logging: console.log,
seederStorage: 'sequelize',
},
dev_stage: {
dialect: 'postgres',
username: process.env.DB_USER,
password: process.env.DB_PASS,
database: process.env.DB_NAME,
host: process.env.DB_HOST,
port: process.env.DB_PORT,
logging: console.log,
seederStorage: 'sequelize',
}
};

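Sequelize (and sequelize-cli, via its config file) selects one of the exported blocks by `NODE_ENV`. A minimal sketch of that lookup, assuming the export shape above; the fallback to `development` mirrors the common convention rather than anything specific to this project:

```javascript
// Minimal sketch of per-environment config selection.
const dbConfig = {
  production: { dialect: 'postgres', host: process.env.DB_HOST },
  development: { dialect: 'postgres', host: process.env.DB_HOST || 'localhost' },
};

const env = process.env.NODE_ENV || 'development';
const active = dbConfig[env] || dbConfig.development;
console.log(active.dialect); // postgres
```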
File diff suppressed because it is too large


@ -0,0 +1,124 @@
module.exports = {
async up(queryInterface, Sequelize) {
const transaction = await queryInterface.sequelize.transaction();
try {
const rows = await queryInterface.sequelize.query(
"SELECT to_regclass('public.files') AS regclass_name;",
{
transaction,
type: Sequelize.QueryTypes.SELECT,
},
);
const tableName = rows[0].regclass_name;
if (tableName) {
await transaction.commit();
return;
}
await queryInterface.createTable(
'files',
{
id: {
type: Sequelize.DataTypes.UUID,
defaultValue: Sequelize.DataTypes.UUIDV4,
primaryKey: true,
},
belongsTo: {
type: Sequelize.DataTypes.STRING(255),
allowNull: true,
},
belongsToId: {
type: Sequelize.DataTypes.UUID,
allowNull: true,
},
belongsToColumn: {
type: Sequelize.DataTypes.STRING(255),
allowNull: true,
},
name: {
type: Sequelize.DataTypes.STRING(2083),
allowNull: false,
},
sizeInBytes: {
type: Sequelize.DataTypes.INTEGER,
allowNull: true,
},
privateUrl: {
type: Sequelize.DataTypes.STRING(2083),
allowNull: true,
},
publicUrl: {
type: Sequelize.DataTypes.STRING(2083),
allowNull: false,
},
createdAt: {
type: Sequelize.DataTypes.DATE,
allowNull: false,
},
updatedAt: {
type: Sequelize.DataTypes.DATE,
allowNull: false,
},
deletedAt: {
type: Sequelize.DataTypes.DATE,
allowNull: true,
},
createdById: {
type: Sequelize.DataTypes.UUID,
allowNull: true,
references: {
key: 'id',
model: 'users',
},
onDelete: 'SET NULL',
onUpdate: 'CASCADE',
},
updatedById: {
type: Sequelize.DataTypes.UUID,
allowNull: true,
references: {
key: 'id',
model: 'users',
},
onDelete: 'SET NULL',
onUpdate: 'CASCADE',
},
},
{ transaction },
);
await transaction.commit();
} catch (err) {
await transaction.rollback();
throw err;
}
},
async down(queryInterface, Sequelize) {
const transaction = await queryInterface.sequelize.transaction();
try {
const rows = await queryInterface.sequelize.query(
"SELECT to_regclass('public.files') AS regclass_name;",
{
transaction,
type: Sequelize.QueryTypes.SELECT,
},
);
const tableName = rows[0].regclass_name;
if (!tableName) {
await transaction.commit();
return;
}
await queryInterface.dropTable('files', { transaction });
await transaction.commit();
} catch (err) {
await transaction.rollback();
throw err;
}
},
};


@ -0,0 +1,77 @@
module.exports = {
async up(queryInterface, Sequelize) {
const transaction = await queryInterface.sequelize.transaction();
try {
const rows = await queryInterface.sequelize.query(
"SELECT to_regclass('public.\"usersCustom_permissionsPermissions\"') AS regclass_name;",
{
transaction,
type: Sequelize.QueryTypes.SELECT,
},
);
const tableName = rows[0].regclass_name;
if (tableName) {
await transaction.commit();
return;
}
await queryInterface.createTable(
'usersCustom_permissionsPermissions',
{
createdAt: {
type: Sequelize.DataTypes.DATE,
allowNull: false,
},
updatedAt: {
type: Sequelize.DataTypes.DATE,
allowNull: false,
},
users_custom_permissionsId: {
type: Sequelize.DataTypes.UUID,
allowNull: false,
primaryKey: true,
},
permissionId: {
type: Sequelize.DataTypes.UUID,
allowNull: false,
primaryKey: true,
},
},
{ transaction },
);
await transaction.commit();
} catch (err) {
await transaction.rollback();
throw err;
}
},
async down(queryInterface, Sequelize) {
const transaction = await queryInterface.sequelize.transaction();
try {
const rows = await queryInterface.sequelize.query(
"SELECT to_regclass('public.\"usersCustom_permissionsPermissions\"') AS regclass_name;",
{
transaction,
type: Sequelize.QueryTypes.SELECT,
},
);
const tableName = rows[0].regclass_name;
if (!tableName) {
await transaction.commit();
return;
}
await queryInterface.dropTable('usersCustom_permissionsPermissions', { transaction });
await transaction.commit();
} catch (err) {
await transaction.rollback();
throw err;
}
},
};


@ -0,0 +1,137 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const alert_events = sequelize.define(
'alert_events',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
triggered_at: {
type: DataTypes.DATE,
},
state: {
type: DataTypes.ENUM,
values: [
"open",
"acknowledged",
"resolved"
],
},
message: {
type: DataTypes.TEXT,
},
observed_value: {
type: DataTypes.DECIMAL,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
alert_events.associate = (db) => {
    /// loop through entities and their fields; if ref matches the current entity name, create a hasMany relation on the parent entity
//end loop
db.alert_events.belongsTo(db.alerts, {
as: 'alert',
foreignKey: {
name: 'alertId',
},
constraints: false,
});
db.alert_events.belongsTo(db.users, {
as: 'acknowledged_by',
foreignKey: {
name: 'acknowledged_byId',
},
constraints: false,
});
db.alert_events.belongsTo(db.users, {
as: 'createdBy',
});
db.alert_events.belongsTo(db.users, {
as: 'updatedBy',
});
};
return alert_events;
};


@ -0,0 +1,210 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const alerts = sequelize.define(
'alerts',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
alert_type: {
type: DataTypes.ENUM,
values: [
"forecast_change",
"signal_change",
"volatility_spike",
"data_quality",
"model_health",
"geopolitical_severity"
],
},
severity: {
type: DataTypes.ENUM,
values: [
"info",
"warning",
"critical"
],
},
delivery_channel: {
type: DataTypes.ENUM,
values: [
"in_app",
"email",
"webhook"
],
},
is_enabled: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
threshold_value: {
type: DataTypes.DECIMAL,
},
rule_description: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
alerts.associate = (db) => {
    /// loop through entities and their fields; if ref matches the current entity name, create a hasMany relation on the parent entity
db.alerts.hasMany(db.alert_events, {
as: 'alert_events_alert',
foreignKey: {
name: 'alertId',
},
constraints: false,
});
//end loop
db.alerts.belongsTo(db.assets, {
as: 'target_asset',
foreignKey: {
name: 'target_assetId',
},
constraints: false,
});
db.alerts.belongsTo(db.models, {
as: 'target_model',
foreignKey: {
name: 'target_modelId',
},
constraints: false,
});
db.alerts.belongsTo(db.users, {
as: 'owner',
foreignKey: {
name: 'ownerId',
},
constraints: false,
});
db.alerts.belongsTo(db.users, {
as: 'createdBy',
});
db.alerts.belongsTo(db.users, {
as: 'updatedBy',
});
};
return alerts;
};


@ -0,0 +1,142 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const api_keys = sequelize.define(
'api_keys',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
scope: {
type: DataTypes.ENUM,
values: [
"read_only",
"read_forecasts",
"read_analytics",
"admin"
],
},
is_active: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
expires_at: {
type: DataTypes.DATE,
},
key_fingerprint: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
api_keys.associate = (db) => {
    /// loop through entities and their fields; if ref matches the current entity name, create a hasMany relation on the parent entity
//end loop
db.api_keys.belongsTo(db.users, {
as: 'owner',
foreignKey: {
name: 'ownerId',
},
constraints: false,
});
db.api_keys.belongsTo(db.users, {
as: 'createdBy',
});
db.api_keys.belongsTo(db.users, {
as: 'updatedBy',
});
};
return api_keys;
};


@ -0,0 +1,221 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const assets = sequelize.define(
'assets',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
symbol: {
type: DataTypes.TEXT,
},
name: {
type: DataTypes.TEXT,
},
asset_class: {
type: DataTypes.ENUM,
values: [
"commodity_spot",
"commodity_future",
"equity",
"equity_index",
"fx",
"rate",
"macro_series",
"risk_index",
"sentiment_index",
"energy",
"crypto",
"other"
],
},
currency: {
type: DataTypes.TEXT,
},
exchange_venue: {
type: DataTypes.TEXT,
},
is_active: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
assets.associate = (db) => {
    /// loop through entities and their fields; if ref matches the current entity name, create a hasMany relation on the parent entity
db.assets.hasMany(db.time_series, {
as: 'time_series_asset',
foreignKey: {
name: 'assetId',
},
constraints: false,
});
db.assets.hasMany(db.mining_companies, {
as: 'mining_companies_equity_asset',
foreignKey: {
name: 'equity_assetId',
},
constraints: false,
});
db.assets.hasMany(db.macro_indicators, {
as: 'macro_indicators_series_asset',
foreignKey: {
name: 'series_assetId',
},
constraints: false,
});
db.assets.hasMany(db.forecasts, {
as: 'forecasts_target_asset',
foreignKey: {
name: 'target_assetId',
},
constraints: false,
});
db.assets.hasMany(db.scenario_shocks, {
as: 'scenario_shocks_asset',
foreignKey: {
name: 'assetId',
},
constraints: false,
});
db.assets.hasMany(db.alerts, {
as: 'alerts_target_asset',
foreignKey: {
name: 'target_assetId',
},
constraints: false,
});
//end loop
db.assets.belongsTo(db.data_sources, {
as: 'primary_data_source',
foreignKey: {
name: 'primary_data_sourceId',
},
constraints: false,
});
db.assets.belongsTo(db.users, {
as: 'createdBy',
});
db.assets.belongsTo(db.users, {
as: 'updatedBy',
});
};
return assets;
};


@ -0,0 +1,183 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const audit_events = sequelize.define(
'audit_events',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
occurred_at: {
type: DataTypes.DATE,
},
event_type: {
type: DataTypes.ENUM,
values: [
"auth_login",
"auth_logout",
"data_ingestion",
"data_quality",
"model_change",
"model_run",
"forecast_publish",
"scenario_change",
"permission_change",
"api_access",
"export"
],
},
outcome: {
type: DataTypes.ENUM,
values: [
"success",
"failure"
],
},
resource_type: {
type: DataTypes.TEXT,
},
resource_identifier: {
type: DataTypes.TEXT,
},
details: {
type: DataTypes.TEXT,
},
ip_address: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
audit_events.associate = (db) => {
    /// loop through entities and their fields; if ref matches the current entity name, create a hasMany relation on the parent entity
//end loop
db.audit_events.belongsTo(db.users, {
as: 'actor',
foreignKey: {
name: 'actorId',
},
constraints: false,
});
db.audit_events.belongsTo(db.users, {
as: 'createdBy',
});
db.audit_events.belongsTo(db.users, {
as: 'updatedBy',
});
};
return audit_events;
};


@ -0,0 +1,243 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const data_sources = sequelize.define(
'data_sources',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
category: {
type: DataTypes.ENUM,
values: [
"spot_prices",
"gold_miners",
"macroeconomic",
"geopolitical",
"additional_variables"
],
},
ingestion_mode: {
type: DataTypes.ENUM,
values: [
"streaming",
"batch",
"hybrid"
],
},
connection_type: {
type: DataTypes.TEXT,
},
coverage_description: {
type: DataTypes.TEXT,
},
refresh_rate: {
type: DataTypes.ENUM,
values: [
"real_time",
"minute",
"hourly",
"daily",
"weekly",
"monthly",
"quarterly",
"event_driven"
],
},
is_enabled: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
last_success_at: {
type: DataTypes.DATE,
},
last_failure_at: {
type: DataTypes.DATE,
},
last_failure_reason: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
data_sources.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.data_sources.hasMany(db.assets, {
as: 'assets_primary_data_source',
foreignKey: {
name: 'primary_data_sourceId',
},
constraints: false,
});
db.data_sources.hasMany(db.macro_indicators, {
as: 'macro_indicators_data_source',
foreignKey: {
name: 'data_sourceId',
},
constraints: false,
});
db.data_sources.hasMany(db.geopolitical_events, {
as: 'geopolitical_events_data_source',
foreignKey: {
name: 'data_sourceId',
},
constraints: false,
});
db.data_sources.hasMany(db.geopolitical_scores, {
as: 'geopolitical_scores_data_source',
foreignKey: {
name: 'data_sourceId',
},
constraints: false,
});
//end loop
db.data_sources.belongsTo(db.users, {
as: 'createdBy',
});
db.data_sources.belongsTo(db.users, {
as: 'updatedBy',
});
};
return data_sources;
};
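The status columns on `data_sources` (`is_enabled`, `last_success_at`, `last_failure_at`) are enough to derive a simple health indicator for a source. A minimal sketch of such a check — the `sourceHealth` helper and its `'healthy'`/`'degraded'` labels are assumptions for illustration, not part of the model:

```javascript
// Hypothetical helper (not part of the model): derive a health label
// from the data_sources status columns defined above.
function sourceHealth(src) {
  if (!src.is_enabled) return 'disabled';
  const ok = src.last_success_at ? new Date(src.last_success_at).getTime() : 0;
  const bad = src.last_failure_at ? new Date(src.last_failure_at).getTime() : 0;
  if (!ok && !bad) return 'unknown'; // source has never run
  return bad > ok ? 'degraded' : 'healthy';
}

console.log(sourceHealth({
  is_enabled: true,
  last_success_at: '2024-06-02T00:00:00Z',
  last_failure_at: '2024-06-01T00:00:00Z',
})); // → healthy
```

A UI or monitoring job could use the same comparison to flag sources whose most recent event was a failure.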


@ -0,0 +1,148 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const factor_attributions = sequelize.define(
'factor_attributions',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
factor_name: {
type: DataTypes.TEXT,
},
factor_category: {
type: DataTypes.ENUM,
values: [
"spot_prices",
"gold_miners",
"macroeconomic",
"geopolitical",
"additional_variables",
"technical",
"model_internal"
],
},
contribution: {
type: DataTypes.DECIMAL,
},
importance: {
type: DataTypes.DECIMAL,
},
notes: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
factor_attributions.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.factor_attributions.belongsTo(db.forecasts, {
as: 'forecast',
foreignKey: {
name: 'forecastId',
},
constraints: false,
});
db.factor_attributions.belongsTo(db.users, {
as: 'createdBy',
});
db.factor_attributions.belongsTo(db.users, {
as: 'updatedBy',
});
};
return factor_attributions;
};


@ -0,0 +1,175 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const feature_sets = sequelize.define(
'feature_sets',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
description: {
type: DataTypes.TEXT,
},
version_stage: {
type: DataTypes.ENUM,
values: [
"research",
"staging",
"production",
"deprecated"
],
},
effective_from_at: {
type: DataTypes.DATE,
},
effective_to_at: {
type: DataTypes.DATE,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
feature_sets.associate = (db) => {
db.feature_sets.belongsToMany(db.assets, {
as: 'input_assets',
foreignKey: {
name: 'feature_sets_input_assetsId',
},
constraints: false,
through: 'feature_setsInput_assetsAssets',
});
db.feature_sets.belongsToMany(db.assets, {
as: 'input_assets_filter',
foreignKey: {
name: 'feature_sets_input_assetsId',
},
constraints: false,
through: 'feature_setsInput_assetsAssets',
});
db.feature_sets.belongsToMany(db.macro_indicators, {
as: 'macro_indicators',
foreignKey: {
name: 'feature_sets_macro_indicatorsId',
},
constraints: false,
through: 'feature_setsMacro_indicatorsMacro_indicators',
});
db.feature_sets.belongsToMany(db.macro_indicators, {
as: 'macro_indicators_filter',
foreignKey: {
name: 'feature_sets_macro_indicatorsId',
},
constraints: false,
through: 'feature_setsMacro_indicatorsMacro_indicators',
});
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.feature_sets.hasMany(db.models, {
as: 'models_feature_set',
foreignKey: {
name: 'feature_setId',
},
constraints: false,
});
//end loop
db.feature_sets.belongsTo(db.users, {
as: 'createdBy',
});
db.feature_sets.belongsTo(db.users, {
as: 'updatedBy',
});
};
return feature_sets;
};


@ -0,0 +1,53 @@
module.exports = function(sequelize, DataTypes) {
const file = sequelize.define(
'file',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
belongsTo: DataTypes.STRING(255),
belongsToId: DataTypes.UUID,
belongsToColumn: DataTypes.STRING(255),
name: {
type: DataTypes.STRING(2083),
allowNull: false,
validate: {
notEmpty: true,
},
},
sizeInBytes: {
type: DataTypes.INTEGER,
allowNull: true,
},
privateUrl: {
type: DataTypes.STRING(2083),
allowNull: true,
},
publicUrl: {
type: DataTypes.STRING(2083),
allowNull: false,
validate: {
notEmpty: true,
},
},
},
{
timestamps: true,
paranoid: true,
},
);
file.associate = (db) => {
db.file.belongsTo(db.users, {
as: 'createdBy',
});
db.file.belongsTo(db.users, {
as: 'updatedBy',
});
};
return file;
};


@ -0,0 +1,226 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const forecasts = sequelize.define(
'forecasts',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
as_of_at: {
type: DataTypes.DATE,
},
horizon: {
type: DataTypes.ENUM,
values: [
"daily",
"weekly",
"monthly",
"quarterly",
"annual"
],
},
target_time_at: {
type: DataTypes.DATE,
},
point_estimate: {
type: DataTypes.DECIMAL,
},
p10: {
type: DataTypes.DECIMAL,
},
p50: {
type: DataTypes.DECIMAL,
},
p90: {
type: DataTypes.DECIMAL,
},
volatility_forecast: {
type: DataTypes.DECIMAL,
},
signal_direction: {
type: DataTypes.ENUM,
values: [
"strong_buy",
"buy",
"neutral",
"sell",
"strong_sell"
],
},
signal_confidence: {
type: DataTypes.DECIMAL,
},
explainability_summary: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
forecasts.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.forecasts.hasMany(db.factor_attributions, {
as: 'factor_attributions_forecast',
foreignKey: {
name: 'forecastId',
},
constraints: false,
});
db.forecasts.hasMany(db.scenario_results, {
as: 'scenario_results_forecast',
foreignKey: {
name: 'forecastId',
},
constraints: false,
});
//end loop
db.forecasts.belongsTo(db.models, {
as: 'model',
foreignKey: {
name: 'modelId',
},
constraints: false,
});
db.forecasts.belongsTo(db.assets, {
as: 'target_asset',
foreignKey: {
name: 'target_assetId',
},
constraints: false,
});
db.forecasts.belongsTo(db.users, {
as: 'createdBy',
});
db.forecasts.belongsTo(db.users, {
as: 'updatedBy',
});
};
return forecasts;
};
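A `forecasts` row carries both a point estimate and p10/p50/p90 quantile columns; any consumer will expect those quantiles to be non-decreasing. A hypothetical sanity check — the helper name is an assumption, and nothing in the schema enforces this:

```javascript
// Hypothetical validation (not enforced by the model): the quantile
// columns p10/p50/p90 defined above should be monotone.
function quantilesAreMonotone(forecast) {
  const { p10, p50, p90 } = forecast;
  return p10 <= p50 && p50 <= p90;
}

console.log(quantilesAreMonotone({ p10: 1850.5, p50: 1900.0, p90: 1975.25 })); // → true
console.log(quantilesAreMonotone({ p10: 1900.0, p50: 1850.5, p90: 1975.25 })); // → false
```

Since `DataTypes.DECIMAL` imposes no ordering constraint, a check like this would belong in the service layer that writes forecasts.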


@ -0,0 +1,226 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const geopolitical_events = sequelize.define(
'geopolitical_events',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
title: {
type: DataTypes.TEXT,
},
focus_area: {
type: DataTypes.ENUM,
values: [
"iran_conflict",
"middle_east",
"europe",
"asia_pacific",
"americas",
"global"
],
},
event_type: {
type: DataTypes.ENUM,
values: [
"military",
"sanctions",
"diplomatic",
"election",
"terrorism",
"shipping_disruption",
"energy_shock",
"cyber",
"other"
],
},
severity: {
type: DataTypes.ENUM,
values: [
"low",
"medium",
"high",
"critical"
],
},
event_start_at: {
type: DataTypes.DATE,
},
event_end_at: {
type: DataTypes.DATE,
},
summary: {
type: DataTypes.TEXT,
},
source_summary: {
type: DataTypes.TEXT,
},
confidence_score: {
type: DataTypes.DECIMAL,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
geopolitical_events.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.geopolitical_events.hasMany(db.geopolitical_scores, {
as: 'geopolitical_scores_related_event',
foreignKey: {
name: 'related_eventId',
},
constraints: false,
});
//end loop
db.geopolitical_events.belongsTo(db.data_sources, {
as: 'data_source',
foreignKey: {
name: 'data_sourceId',
},
constraints: false,
});
db.geopolitical_events.belongsTo(db.users, {
as: 'createdBy',
});
db.geopolitical_events.belongsTo(db.users, {
as: 'updatedBy',
});
};
return geopolitical_events;
};


@ -0,0 +1,169 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const geopolitical_scores = sequelize.define(
'geopolitical_scores',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
as_of_at: {
type: DataTypes.DATE,
},
score_type: {
type: DataTypes.ENUM,
values: [
"news_sentiment",
"event_intensity",
"expert_overlay",
"composite_risk"
],
},
horizon: {
type: DataTypes.ENUM,
values: [
"intraday",
"daily",
"weekly",
"monthly"
],
},
score_value: {
type: DataTypes.DECIMAL,
},
iran_conflict_weight: {
type: DataTypes.DECIMAL,
},
methodology_note: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
geopolitical_scores.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.geopolitical_scores.belongsTo(db.geopolitical_events, {
as: 'related_event',
foreignKey: {
name: 'related_eventId',
},
constraints: false,
});
db.geopolitical_scores.belongsTo(db.data_sources, {
as: 'data_source',
foreignKey: {
name: 'data_sourceId',
},
constraints: false,
});
db.geopolitical_scores.belongsTo(db.users, {
as: 'createdBy',
});
db.geopolitical_scores.belongsTo(db.users, {
as: 'updatedBy',
});
};
return geopolitical_scores;
};


@ -0,0 +1,38 @@
'use strict';
const fs = require('fs');
const path = require('path');
const Sequelize = require('sequelize');
const basename = path.basename(__filename);
const env = process.env.NODE_ENV || 'development';
const config = require('../db.config')[env];
const db = {};
let sequelize;
console.log(`Sequelize environment: ${env}`);
if (config.use_env_variable) {
sequelize = new Sequelize(process.env[config.use_env_variable], config);
} else {
sequelize = new Sequelize(config.database, config.username, config.password, config);
}
fs
.readdirSync(__dirname)
.filter(file => {
return (file.indexOf('.') !== 0) && (file !== basename) && (file.slice(-3) === '.js');
})
.forEach(file => {
const model = require(path.join(__dirname, file))(sequelize, Sequelize.DataTypes)
db[model.name] = model;
});
Object.keys(db).forEach(modelName => {
if (db[modelName].associate) {
db[modelName].associate(db);
}
});
db.sequelize = sequelize;
db.Sequelize = Sequelize;
module.exports = db;
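The loader above uses a two-pass pattern: every model file is registered first, and only then is each model's `associate` called, so mutually referencing models (e.g. `forecasts` and `scenario_results`) can resolve each other regardless of file order. A dependency-free sketch of the same pattern with plain objects, as an illustration rather than the real Sequelize API:

```javascript
// Dependency-free sketch of the two-pass registry in index.js:
// pass 1 registers models, pass 2 wires associations once every
// model exists, so circular references resolve.
const db = {};

function defineModel(name, associate) {
  db[name] = { name, associations: [], associate };
}

// Pass 1: registration order does not matter.
defineModel('users', (db) => {
  db.users.associations.push({ type: 'hasMany', target: db.audit_events.name, as: 'audit_events_actor' });
});
defineModel('audit_events', (db) => {
  db.audit_events.associations.push({ type: 'belongsTo', target: db.users.name, as: 'actor' });
});

// Pass 2: mirrors Object.keys(db).forEach(...) in index.js.
Object.keys(db).forEach((modelName) => {
  if (db[modelName].associate) db[modelName].associate(db);
});

console.log(db.users.associations[0].target); // → audit_events
```

Had `associate` run during pass 1, `db.audit_events` would still have been undefined when `users` tried to reference it.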


@ -0,0 +1,200 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const macro_indicators = sequelize.define(
'macro_indicators',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
code: {
type: DataTypes.TEXT,
},
name: {
type: DataTypes.TEXT,
},
indicator_type: {
type: DataTypes.ENUM,
values: [
"interest_rate",
"inflation",
"currency_strength",
"gdp",
"employment",
"pmi",
"liquidity",
"credit_spread",
"real_rate",
"other"
],
},
frequency: {
type: DataTypes.ENUM,
values: [
"daily",
"weekly",
"monthly",
"quarterly",
"annual"
],
},
unit: {
type: DataTypes.TEXT,
},
region: {
type: DataTypes.TEXT,
},
is_active: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
macro_indicators.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.macro_indicators.belongsTo(db.data_sources, {
as: 'data_source',
foreignKey: {
name: 'data_sourceId',
},
constraints: false,
});
db.macro_indicators.belongsTo(db.assets, {
as: 'series_asset',
foreignKey: {
name: 'series_assetId',
},
constraints: false,
});
db.macro_indicators.belongsTo(db.users, {
as: 'createdBy',
});
db.macro_indicators.belongsTo(db.users, {
as: 'updatedBy',
});
};
return macro_indicators;
};


@ -0,0 +1,135 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const mining_companies = sequelize.define(
'mining_companies',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
ticker: {
type: DataTypes.TEXT,
},
company_name: {
type: DataTypes.TEXT,
},
country: {
type: DataTypes.TEXT,
},
primary_mines: {
type: DataTypes.TEXT,
},
is_major_producer: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
mining_companies.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.mining_companies.hasMany(db.mining_fundamentals, {
as: 'mining_fundamentals_mining_company',
foreignKey: {
name: 'mining_companyId',
},
constraints: false,
});
//end loop
db.mining_companies.belongsTo(db.assets, {
as: 'equity_asset',
foreignKey: {
name: 'equity_assetId',
},
constraints: false,
});
db.mining_companies.belongsTo(db.users, {
as: 'createdBy',
});
db.mining_companies.belongsTo(db.users, {
as: 'updatedBy',
});
};
return mining_companies;
};


@ -0,0 +1,182 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const mining_fundamentals = sequelize.define(
'mining_fundamentals',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
reporting_period: {
type: DataTypes.ENUM,
values: [
"quarterly",
"annual"
],
},
period_end_at: {
type: DataTypes.DATE,
},
production_oz: {
type: DataTypes.DECIMAL,
},
all_in_sustaining_cost: {
type: DataTypes.DECIMAL,
},
cash_cost: {
type: DataTypes.DECIMAL,
},
reserves_oz: {
type: DataTypes.DECIMAL,
},
revenue: {
type: DataTypes.DECIMAL,
},
ebitda: {
type: DataTypes.DECIMAL,
},
free_cash_flow: {
type: DataTypes.DECIMAL,
},
debt_to_equity: {
type: DataTypes.DECIMAL,
},
operating_margin: {
type: DataTypes.DECIMAL,
},
notes: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
mining_fundamentals.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.mining_fundamentals.belongsTo(db.mining_companies, {
as: 'mining_company',
foreignKey: {
name: 'mining_companyId',
},
constraints: false,
});
db.mining_fundamentals.belongsTo(db.users, {
as: 'createdBy',
});
db.mining_fundamentals.belongsTo(db.users, {
as: 'updatedBy',
});
};
return mining_fundamentals;
};


@ -0,0 +1,174 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const model_runs = sequelize.define(
'model_runs',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
run_type: {
type: DataTypes.ENUM,
values: [
"training",
"backtest",
"walk_forward",
"stress_test",
"inference_validation"
],
},
run_status: {
type: DataTypes.ENUM,
values: [
"queued",
"running",
"succeeded",
"failed",
"canceled"
],
},
started_at: {
type: DataTypes.DATE,
},
ended_at: {
type: DataTypes.DATE,
},
data_window: {
type: DataTypes.TEXT,
},
metrics_summary: {
type: DataTypes.TEXT,
},
error_details: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
model_runs.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.model_runs.belongsTo(db.models, {
as: 'model',
foreignKey: {
name: 'modelId',
},
constraints: false,
});
db.model_runs.belongsTo(db.users, {
as: 'createdBy',
});
db.model_runs.belongsTo(db.users, {
as: 'updatedBy',
});
};
return model_runs;
};


@ -0,0 +1,226 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const models = sequelize.define(
'models',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
model_family: {
type: DataTypes.ENUM,
values: [
"time_series",
"traditional_ml",
"deep_learning",
"bayesian",
"hybrid"
],
},
training_mode: {
type: DataTypes.ENUM,
values: [
"batch",
"incremental"
],
},
status: {
type: DataTypes.ENUM,
values: [
"research",
"staging",
"production",
"retired"
],
},
objective: {
type: DataTypes.TEXT,
},
target_metric_value: {
type: DataTypes.DECIMAL,
},
last_trained_at: {
type: DataTypes.DATE,
},
artifact_uri: {
type: DataTypes.TEXT,
},
config_snapshot: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
models.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.models.hasMany(db.model_runs, {
as: 'model_runs_model',
foreignKey: {
name: 'modelId',
},
constraints: false,
});
db.models.hasMany(db.forecasts, {
as: 'forecasts_model',
foreignKey: {
name: 'modelId',
},
constraints: false,
});
db.models.hasMany(db.alerts, {
as: 'alerts_target_model',
foreignKey: {
name: 'target_modelId',
},
constraints: false,
});
//end loop
db.models.belongsTo(db.feature_sets, {
as: 'feature_set',
foreignKey: {
name: 'feature_setId',
},
constraints: false,
});
db.models.belongsTo(db.users, {
as: 'owner',
foreignKey: {
name: 'ownerId',
},
constraints: false,
});
db.models.belongsTo(db.users, {
as: 'createdBy',
});
db.models.belongsTo(db.users, {
as: 'updatedBy',
});
};
return models;
};


@ -0,0 +1,88 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const permissions = sequelize.define(
'permissions',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
permissions.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.permissions.belongsTo(db.users, {
as: 'createdBy',
});
db.permissions.belongsTo(db.users, {
as: 'updatedBy',
});
};
return permissions;
};


@ -0,0 +1,121 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const roles = sequelize.define(
'roles',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
role_customization: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
roles.associate = (db) => {
db.roles.belongsToMany(db.permissions, {
as: 'permissions',
foreignKey: {
name: 'roles_permissionsId',
},
constraints: false,
through: 'rolesPermissionsPermissions',
});
db.roles.belongsToMany(db.permissions, {
as: 'permissions_filter',
foreignKey: {
name: 'roles_permissionsId',
},
constraints: false,
through: 'rolesPermissionsPermissions',
});
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.roles.hasMany(db.users, {
as: 'users_app_role',
foreignKey: {
name: 'app_roleId',
},
constraints: false,
});
//end loop
db.roles.belongsTo(db.users, {
as: 'createdBy',
});
db.roles.belongsTo(db.users, {
as: 'updatedBy',
});
};
return roles;
};


@ -0,0 +1,132 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const scenario_results = sequelize.define(
'scenario_results',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
delta_point_estimate: {
type: DataTypes.DECIMAL,
},
delta_p50: {
type: DataTypes.DECIMAL,
},
delta_volatility: {
type: DataTypes.DECIMAL,
},
pnl_proxy: {
type: DataTypes.DECIMAL,
},
narrative: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
scenario_results.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.scenario_results.belongsTo(db.scenarios, {
as: 'scenario',
foreignKey: {
name: 'scenarioId',
},
constraints: false,
});
db.scenario_results.belongsTo(db.forecasts, {
as: 'forecast',
foreignKey: {
name: 'forecastId',
},
constraints: false,
});
db.scenario_results.belongsTo(db.users, {
as: 'createdBy',
});
db.scenario_results.belongsTo(db.users, {
as: 'updatedBy',
});
};
return scenario_results;
};


@ -0,0 +1,144 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const scenario_shocks = sequelize.define(
'scenario_shocks',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
shock_style: {
type: DataTypes.ENUM,
values: [
"absolute",
"percent",
"path"
],
},
shock_value: {
type: DataTypes.DECIMAL,
},
shock_start_at: {
type: DataTypes.DATE,
},
shock_end_at: {
type: DataTypes.DATE,
},
shock_path_note: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
scenario_shocks.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.scenario_shocks.belongsTo(db.scenarios, {
as: 'scenario',
foreignKey: {
name: 'scenarioId',
},
constraints: false,
});
db.scenario_shocks.belongsTo(db.assets, {
as: 'asset',
foreignKey: {
name: 'assetId',
},
constraints: false,
});
db.scenario_shocks.belongsTo(db.users, {
as: 'createdBy',
});
db.scenario_shocks.belongsTo(db.users, {
as: 'updatedBy',
});
};
return scenario_shocks;
};


@ -0,0 +1,193 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const scenarios = sequelize.define(
'scenarios',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
scenario_type: {
type: DataTypes.ENUM,
values: [
"macro_shock",
"geopolitical_shock",
"rates_shift",
"inflation_shift",
"fx_shift",
"energy_shock",
"central_bank_activity",
"custom"
],
},
status: {
type: DataTypes.ENUM,
values: [
"draft",
"approved",
"archived"
],
},
description: {
type: DataTypes.TEXT,
},
valid_from_at: {
type: DataTypes.DATE,
},
valid_to_at: {
type: DataTypes.DATE,
},
probability_weight: {
type: DataTypes.DECIMAL,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
scenarios.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.scenarios.hasMany(db.scenario_shocks, {
as: 'scenario_shocks_scenario',
foreignKey: {
name: 'scenarioId',
},
constraints: false,
});
db.scenarios.hasMany(db.scenario_results, {
as: 'scenario_results_scenario',
foreignKey: {
name: 'scenarioId',
},
constraints: false,
});
//end loop
db.scenarios.belongsTo(db.users, {
as: 'owner',
foreignKey: {
name: 'ownerId',
},
constraints: false,
});
db.scenarios.belongsTo(db.users, {
as: 'createdBy',
});
db.scenarios.belongsTo(db.users, {
as: 'updatedBy',
});
};
return scenarios;
};


@ -0,0 +1,207 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const time_series = sequelize.define(
'time_series',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
timestamp: {
type: DataTypes.DATE,
},
bar_size: {
type: DataTypes.ENUM,
values: [
"tick",
"1m",
"5m",
"15m",
"1h",
"1d",
"1w",
"1mo",
"1q",
"1y"
],
},
open: {
type: DataTypes.DECIMAL,
},
high: {
type: DataTypes.DECIMAL,
},
low: {
type: DataTypes.DECIMAL,
},
close: {
type: DataTypes.DECIMAL,
},
volume: {
type: DataTypes.DECIMAL,
},
vwap: {
type: DataTypes.DECIMAL,
},
is_imputed: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
quality_flag: {
type: DataTypes.ENUM,
values: [
"ok",
"warning",
"bad"
],
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
time_series.associate = (db) => {
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.time_series.belongsTo(db.assets, {
as: 'asset',
foreignKey: {
name: 'assetId',
},
constraints: false,
});
db.time_series.belongsTo(db.users, {
as: 'createdBy',
});
db.time_series.belongsTo(db.users, {
as: 'updatedBy',
});
};
return time_series;
};
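The bar columns above include a `vwap` field. If it ever needs to be recomputed from stored bars, one common convention uses the typical price (high + low + close) / 3; a sketch under that assumption — the helper is hypothetical, not part of the repo:

```javascript
// Hypothetical helper: recompute a volume-weighted average price from
// bars shaped like the time_series columns above, using the common
// typical-price convention (high + low + close) / 3.
function computeVwap(bars) {
  let priceVolume = 0;
  let totalVolume = 0;
  for (const bar of bars) {
    const typical = (bar.high + bar.low + bar.close) / 3;
    priceVolume += typical * bar.volume;
    totalVolume += bar.volume;
  }
  return totalVolume === 0 ? null : priceVolume / totalVolume;
}

console.log(computeVwap([
  { high: 10, low: 8, close: 9, volume: 100 },
  { high: 12, low: 10, close: 11, volume: 300 },
])); // → 10.5
```

Returning `null` for zero total volume mirrors how the model leaves `vwap` nullable rather than forcing a value onto empty or imputed bars.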


@ -0,0 +1,294 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const users = sequelize.define(
'users',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
firstName: {
type: DataTypes.TEXT,
},
lastName: {
type: DataTypes.TEXT,
},
phoneNumber: {
type: DataTypes.TEXT,
},
email: {
type: DataTypes.TEXT,
},
disabled: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
password: {
type: DataTypes.TEXT,
},
emailVerified: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
emailVerificationToken: {
type: DataTypes.TEXT,
},
emailVerificationTokenExpiresAt: {
type: DataTypes.DATE,
},
passwordResetToken: {
type: DataTypes.TEXT,
},
passwordResetTokenExpiresAt: {
type: DataTypes.DATE,
},
provider: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
users.associate = (db) => {
db.users.belongsToMany(db.permissions, {
as: 'custom_permissions',
foreignKey: {
name: 'users_custom_permissionsId',
},
constraints: false,
through: 'usersCustom_permissionsPermissions',
});
db.users.belongsToMany(db.permissions, {
as: 'custom_permissions_filter',
foreignKey: {
name: 'users_custom_permissionsId',
},
constraints: false,
through: 'usersCustom_permissionsPermissions',
});
// Loop through entities and their fields; if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.users.hasMany(db.models, {
as: 'models_owner',
foreignKey: {
name: 'ownerId',
},
constraints: false,
});
db.users.hasMany(db.scenarios, {
as: 'scenarios_owner',
foreignKey: {
name: 'ownerId',
},
constraints: false,
});
db.users.hasMany(db.alerts, {
as: 'alerts_owner',
foreignKey: {
name: 'ownerId',
},
constraints: false,
});
db.users.hasMany(db.alert_events, {
as: 'alert_events_acknowledged_by',
foreignKey: {
name: 'acknowledged_byId',
},
constraints: false,
});
db.users.hasMany(db.api_keys, {
as: 'api_keys_owner',
foreignKey: {
name: 'ownerId',
},
constraints: false,
});
db.users.hasMany(db.audit_events, {
as: 'audit_events_actor',
foreignKey: {
name: 'actorId',
},
constraints: false,
});
//end loop
db.users.belongsTo(db.roles, {
as: 'app_role',
foreignKey: {
name: 'app_roleId',
},
constraints: false,
});
db.users.hasMany(db.file, {
as: 'avatar',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.users.getTableName(),
belongsToColumn: 'avatar',
},
});
db.users.belongsTo(db.users, {
as: 'createdBy',
});
db.users.belongsTo(db.users, {
as: 'updatedBy',
});
};
users.beforeCreate((users, options) => {
users = trimStringFields(users);
if (users.provider !== providers.LOCAL && Object.values(providers).indexOf(users.provider) > -1) {
users.emailVerified = true;
if (!users.password) {
const password = crypto
.randomBytes(20)
.toString('hex');
const hashedPassword = bcrypt.hashSync(
password,
config.bcrypt.saltRounds,
);
users.password = hashedPassword;
}
}
});
users.beforeUpdate((users, options) => {
users = trimStringFields(users);
});
return users;
};
function trimStringFields(users) {
users.email = users.email.trim();
users.firstName = users.firstName
? users.firstName.trim()
: null;
users.lastName = users.lastName
? users.lastName.trim()
: null;
return users;
}

16
backend/src/db/reset.js Normal file

@ -0,0 +1,16 @@
const db = require('./models');
const { execSync } = require('child_process');
console.log('Resetting Database');
db.sequelize
.sync({ force: true })
.then(() => {
execSync('sequelize db:seed:all');
console.log('OK');
process.exit();
})
.catch((error) => {
console.error(error);
process.exit(1);
});

View File

@ -0,0 +1,66 @@
'use strict';
const bcrypt = require('bcrypt');
const config = require('../../config');
const ids = [
'193bf4b5-9f07-4bd5-9a43-e7e41f3e96af',
'af5a87be-8f9c-4630-902a-37a60b7005ba',
'5bc531ab-611f-41f3-9373-b7cc5d09c93d',
];
module.exports = {
up: async (queryInterface, Sequelize) => {
let admin_hash = bcrypt.hashSync(config.admin_pass, config.bcrypt.saltRounds);
let user_hash = bcrypt.hashSync(config.user_pass, config.bcrypt.saltRounds);
try {
await queryInterface.bulkInsert('users', [
{
id: ids[0],
firstName: 'Admin',
email: config.admin_email,
emailVerified: true,
provider: config.providers.LOCAL,
password: admin_hash,
createdAt: new Date(),
updatedAt: new Date()
},
{
id: ids[1],
firstName: 'John',
email: 'john@doe.com',
emailVerified: true,
provider: config.providers.LOCAL,
password: user_hash,
createdAt: new Date(),
updatedAt: new Date()
},
{
id: ids[2],
firstName: 'Client',
email: 'client@hello.com',
emailVerified: true,
provider: config.providers.LOCAL,
password: user_hash,
createdAt: new Date(),
updatedAt: new Date()
},
]);
} catch (error) {
console.error('Error during bulkInsert:', error);
throw error;
}
},
down: async (queryInterface, Sequelize) => {
try {
await queryInterface.bulkDelete('users', {
id: {
[Sequelize.Op.in]: ids,
},
}, {});
} catch (error) {
console.error('Error during bulkDelete:', error);
throw error;
}
}
}

File diff suppressed because it is too large Load Diff

File diff suppressed because it is too large Load Diff

27
backend/src/db/utils.js Normal file
View File

@ -0,0 +1,27 @@
const validator = require('validator');
const { v4: uuid } = require('uuid');
const Sequelize = require('./models').Sequelize;
module.exports = class Utils {
static uuid(value) {
let id = value;
if (!validator.isUUID(id)) {
id = uuid();
}
return id;
}
static ilike(model, column, value) {
return Sequelize.where(
Sequelize.fn(
'lower',
Sequelize.col(`${model}.${column}`),
),
{
[Sequelize.Op.like]: `%${value}%`.toLowerCase(),
},
);
}
};
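The `ilike` helper builds a `lower(column) LIKE '%value%'` predicate for the database. A plain-JS sketch of the same matching semantics (a stand-in, not the Sequelize expression itself) makes the case-insensitivity easy to check:

```javascript
// Plain-JS stand-in for the predicate utils.ilike() asks the database to
// evaluate: lower(column) LIKE '%value%', with the search term lowercased.
function ilikeMatch(columnValue, search) {
  return String(columnValue).toLowerCase().includes(String(search).toLowerCase());
}

console.log(ilikeMatch('Gold Futures', 'gold')); // true
console.log(ilikeMatch('Silver', 'gold'));       // false
```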

23
backend/src/helpers.js Normal file
View File

@ -0,0 +1,23 @@
const jwt = require('jsonwebtoken');
const config = require('./config');
module.exports = class Helpers {
static wrapAsync(fn) {
return function (req, res, next) {
fn(req, res, next).catch(next);
};
}
static commonErrorHandler(error, req, res, next) {
if ([400, 403, 404].includes(error.code)) {
return res.status(error.code).send(error.message);
}
console.error(error);
return res.status(500).send(error.message);
}
static jwtSign(data) {
return jwt.sign(data, config.secret_key, {expiresIn: '6h'});
};
};
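The error-handling contract of `commonErrorHandler` can be demonstrated without Express: known client error codes (400/403/404) pass through with their message, anything else becomes a 500. The `res` stub below is hypothetical and only mimics the chained `status().send()` API; logging is omitted:

```javascript
// Stub-based sketch of commonErrorHandler (logging omitted for brevity).
function commonErrorHandler(error, req, res, next) {
  if ([400, 403, 404].includes(error.code)) {
    return res.status(error.code).send(error.message);
  }
  return res.status(500).send(error.message);
}

// Minimal hypothetical stand-in for the Express response object.
function makeRes() {
  const r = {};
  r.status = (code) => { r.code = code; return r; };
  r.send = (body) => { r.body = body; return r; };
  return r;
}

const notFound = makeRes();
commonErrorHandler({ code: 404, message: 'Item not found' }, {}, notFound, () => {});
console.log(notFound.code); // 404

const boom = makeRes();
commonErrorHandler(new Error('db down'), {}, boom, () => {});
console.log(boom.code);     // 500
```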

216
backend/src/index.js Normal file
View File

@ -0,0 +1,216 @@
const express = require('express');
const cors = require('cors');
const app = express();
const passport = require('passport');
const path = require('path');
const fs = require('fs');
const bodyParser = require('body-parser');
const db = require('./db/models');
const config = require('./config');
const swaggerUI = require('swagger-ui-express');
const swaggerJsDoc = require('swagger-jsdoc');
const authRoutes = require('./routes/auth');
const fileRoutes = require('./routes/file');
const searchRoutes = require('./routes/search');
const sqlRoutes = require('./routes/sql');
const pexelsRoutes = require('./routes/pexels');
const openaiRoutes = require('./routes/openai');
const usersRoutes = require('./routes/users');
const rolesRoutes = require('./routes/roles');
const permissionsRoutes = require('./routes/permissions');
const data_sourcesRoutes = require('./routes/data_sources');
const assetsRoutes = require('./routes/assets');
const time_seriesRoutes = require('./routes/time_series');
const mining_companiesRoutes = require('./routes/mining_companies');
const mining_fundamentalsRoutes = require('./routes/mining_fundamentals');
const macro_indicatorsRoutes = require('./routes/macro_indicators');
const geopolitical_eventsRoutes = require('./routes/geopolitical_events');
const geopolitical_scoresRoutes = require('./routes/geopolitical_scores');
const feature_setsRoutes = require('./routes/feature_sets');
const modelsRoutes = require('./routes/models');
const model_runsRoutes = require('./routes/model_runs');
const forecastsRoutes = require('./routes/forecasts');
const factor_attributionsRoutes = require('./routes/factor_attributions');
const scenariosRoutes = require('./routes/scenarios');
const scenario_shocksRoutes = require('./routes/scenario_shocks');
const scenario_resultsRoutes = require('./routes/scenario_results');
const alertsRoutes = require('./routes/alerts');
const alert_eventsRoutes = require('./routes/alert_events');
const api_keysRoutes = require('./routes/api_keys');
const audit_eventsRoutes = require('./routes/audit_events');
const getBaseUrl = (url) => {
if (!url) return '';
return url.endsWith('/api') ? url.slice(0, -4) : url;
};
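`getBaseUrl` strips a trailing `/api` segment so the Swagger server URL points at the host root. A quick check with made-up URLs (the `example.test` host is illustrative):

```javascript
// Same helper as above, exercised on sample inputs.
const getBaseUrl = (url) => {
  if (!url) return '';
  return url.endsWith('/api') ? url.slice(0, -4) : url;
};

console.log(getBaseUrl('https://example.test/api')); // 'https://example.test'
console.log(getBaseUrl('https://example.test'));     // 'https://example.test'
console.log(getBaseUrl(undefined));                  // ''
```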
const options = {
definition: {
openapi: "3.0.0",
info: {
version: "1.0.0",
title: "Gold Forecasting Engine",
description: "Gold Forecasting Engine online REST API for testing and prototyping. You can perform all major operations with your entities: create, read, update, and delete.",
},
servers: [
{
url: getBaseUrl(process.env.NEXT_PUBLIC_BACK_API) || config.swaggerUrl,
description: "Development server",
}
],
components: {
securitySchemes: {
bearerAuth: {
type: 'http',
scheme: 'bearer',
bearerFormat: 'JWT',
}
},
responses: {
UnauthorizedError: {
description: "Access token is missing or invalid"
}
}
},
security: [{
bearerAuth: []
}]
},
apis: ["./src/routes/*.js"],
};
const specs = swaggerJsDoc(options);
app.use('/api-docs', function (req, res, next) {
swaggerUI.host = getBaseUrl(process.env.NEXT_PUBLIC_BACK_API) || req.get('host');
next();
}, swaggerUI.serve, swaggerUI.setup(specs));
app.use(cors({origin: true}));
require('./auth/auth');
app.use(bodyParser.json());
app.use('/api/auth', authRoutes);
app.use('/api/file', fileRoutes);
app.use('/api/pexels', pexelsRoutes);
app.enable('trust proxy');
app.use('/api/users', passport.authenticate('jwt', {session: false}), usersRoutes);
app.use('/api/roles', passport.authenticate('jwt', {session: false}), rolesRoutes);
app.use('/api/permissions', passport.authenticate('jwt', {session: false}), permissionsRoutes);
app.use('/api/data_sources', passport.authenticate('jwt', {session: false}), data_sourcesRoutes);
app.use('/api/assets', passport.authenticate('jwt', {session: false}), assetsRoutes);
app.use('/api/time_series', passport.authenticate('jwt', {session: false}), time_seriesRoutes);
app.use('/api/mining_companies', passport.authenticate('jwt', {session: false}), mining_companiesRoutes);
app.use('/api/mining_fundamentals', passport.authenticate('jwt', {session: false}), mining_fundamentalsRoutes);
app.use('/api/macro_indicators', passport.authenticate('jwt', {session: false}), macro_indicatorsRoutes);
app.use('/api/geopolitical_events', passport.authenticate('jwt', {session: false}), geopolitical_eventsRoutes);
app.use('/api/geopolitical_scores', passport.authenticate('jwt', {session: false}), geopolitical_scoresRoutes);
app.use('/api/feature_sets', passport.authenticate('jwt', {session: false}), feature_setsRoutes);
app.use('/api/models', passport.authenticate('jwt', {session: false}), modelsRoutes);
app.use('/api/model_runs', passport.authenticate('jwt', {session: false}), model_runsRoutes);
app.use('/api/forecasts', passport.authenticate('jwt', {session: false}), forecastsRoutes);
app.use('/api/factor_attributions', passport.authenticate('jwt', {session: false}), factor_attributionsRoutes);
app.use('/api/scenarios', passport.authenticate('jwt', {session: false}), scenariosRoutes);
app.use('/api/scenario_shocks', passport.authenticate('jwt', {session: false}), scenario_shocksRoutes);
app.use('/api/scenario_results', passport.authenticate('jwt', {session: false}), scenario_resultsRoutes);
app.use('/api/alerts', passport.authenticate('jwt', {session: false}), alertsRoutes);
app.use('/api/alert_events', passport.authenticate('jwt', {session: false}), alert_eventsRoutes);
app.use('/api/api_keys', passport.authenticate('jwt', {session: false}), api_keysRoutes);
app.use('/api/audit_events', passport.authenticate('jwt', {session: false}), audit_eventsRoutes);
app.use(
'/api/openai',
passport.authenticate('jwt', { session: false }),
openaiRoutes,
);
app.use(
'/api/ai',
passport.authenticate('jwt', { session: false }),
openaiRoutes,
);
app.use(
'/api/search',
passport.authenticate('jwt', { session: false }),
searchRoutes);
app.use(
'/api/sql',
passport.authenticate('jwt', { session: false }),
sqlRoutes);
const publicDir = path.join(
__dirname,
'../public',
);
if (fs.existsSync(publicDir)) {
app.use('/', express.static(publicDir));
app.get('*', function(request, response) {
response.sendFile(
path.resolve(publicDir, 'index.html'),
);
});
}
const PORT = process.env.NODE_ENV === 'dev_stage' ? 3000 : 8080;
app.listen(PORT, () => {
console.log(`Listening on port ${PORT}`);
});
module.exports = app;

View File

@ -0,0 +1,149 @@
const ValidationError = require('../services/notifications/errors/validation');
const RolesDBApi = require('../db/api/roles');
// Cache for the 'Public' role object
let publicRoleCache = null;
// Function to asynchronously fetch and cache the 'Public' role
async function fetchAndCachePublicRole() {
try {
// Use RolesDBApi to find the role by name 'Public'
publicRoleCache = await RolesDBApi.findBy({ name: 'Public' });
if (!publicRoleCache) {
console.error("WARNING: Role 'Public' not found in database during middleware startup. Check your migrations.");
// The system might not function correctly without this role. May need to throw an error or use a fallback stub.
} else {
console.log("'Public' role successfully loaded and cached.");
}
} catch (error) {
console.error("Error fetching 'Public' role during middleware startup:", error);
// Handle the error during startup fetch
throw error; // Important to know if the app can proceed without the Public role
}
}
// Trigger the role fetching when the check-permissions.js module is imported/loaded
// This should happen during application startup when routes are being configured.
fetchAndCachePublicRole().catch(error => {
// Handle the case where the fetchAndCachePublicRole promise is rejected
console.error("Critical error during permissions middleware initialization:", error);
// Decide here if the process should exit if the Public role is essential.
// process.exit(1);
});
/**
* Middleware creator to check if the current user (or Public role) has a specific permission.
* @param {string} permission - The name of the required permission.
* @return {import("express").RequestHandler} Express middleware function.
*/
function checkPermissions(permission) {
return async (req, res, next) => {
const { currentUser } = req;
// 1. Check self-access bypass (only if the user is authenticated)
if (currentUser && (currentUser.id === req.params.id || currentUser.id === req.body.id)) {
return next(); // User has access to their own resource
}
// 2. Check Custom Permissions (only if the user is authenticated)
if (currentUser) {
// Ensure custom_permissions is an array before using find
const customPermissions = Array.isArray(currentUser.custom_permissions)
? currentUser.custom_permissions
: [];
const userPermission = customPermissions.find(
(cp) => cp.name === permission,
);
if (userPermission) {
return next(); // User has a custom permission
}
}
// 3. Determine the "effective" role for permission check
let effectiveRole = null;
try {
if (currentUser && currentUser.app_role) {
// User is authenticated and has an assigned role
effectiveRole = currentUser.app_role;
} else {
// User is NOT authenticated OR is authenticated but has no role
// Use the cached 'Public' role
if (!publicRoleCache) {
// If the cache is unexpectedly empty (e.g., a startup error was caught),
// we can fall back to fetching the role on demand, or just deny access.
console.error("Public role cache is empty. Attempting on-demand fetch...");
// Less efficient fallback: one extra query per request until the role loads.
effectiveRole = await RolesDBApi.findBy({ name: 'Public' }); // Could be slow
if (!effectiveRole) {
// If even the fallback fetch failed
return next(new Error("Internal Server Error: Public role missing and cannot be fetched."));
}
} else {
effectiveRole = publicRoleCache; // Use the cached object
}
}
// Check if we got a valid role object
if (!effectiveRole) {
return next(new Error("Internal Server Error: Could not determine effective role."));
}
// 4. Check Permissions on the "effective" role
// Assume the effectiveRole object (from app_role or RolesDBApi) has a getPermissions() method
// or a 'permissions' property (if permissions are eagerly loaded).
let rolePermissions = [];
if (typeof effectiveRole.getPermissions === 'function') {
rolePermissions = await effectiveRole.getPermissions(); // Get permissions asynchronously if the method exists
} else if (Array.isArray(effectiveRole.permissions)) {
rolePermissions = effectiveRole.permissions; // Or take from property if permissions are pre-loaded
} else {
console.error("Role object lacks getPermissions() method or permissions property:", effectiveRole);
return next(new Error("Internal Server Error: Invalid role object format."));
}
if (rolePermissions.find((p) => p.name === permission)) {
next(); // The "effective" role has the required permission
} else {
// The "effective" role does not have the required permission
const roleName = effectiveRole.name || 'unknown role';
next(new ValidationError('auth.forbidden', `Role '${roleName}' denied access to '${permission}'.`));
}
} catch (e) {
// Handle errors during role or permission fetching
console.error("Error during permission check:", e);
next(e); // Pass the error to the next middleware
}
};
}
const METHOD_MAP = {
POST: 'CREATE',
GET: 'READ',
PUT: 'UPDATE',
PATCH: 'UPDATE',
DELETE: 'DELETE',
};
/**
* Middleware creator to check standard CRUD permissions based on HTTP method and entity name.
* @param {string} name - The name of the entity.
* @return {import("express").RequestHandler} Express middleware function.
*/
function checkCrudPermissions(name) {
return (req, res, next) => {
// Dynamically determine the permission name (e.g., 'READ_USERS')
const permissionName = `${METHOD_MAP[req.method]}_${name.toUpperCase()}`;
// Call the checkPermissions middleware with the determined permission
checkPermissions(permissionName)(req, res, next);
};
}
module.exports = {
checkPermissions,
checkCrudPermissions,
};
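The naming convention `checkCrudPermissions` relies on is worth seeing in isolation: the HTTP method is mapped through `METHOD_MAP`, then joined with the upper-cased entity name. A minimal sketch (the entity names below are illustrative):

```javascript
// How checkCrudPermissions() derives a permission name from the HTTP method
// and the entity name passed at mount time.
const METHOD_MAP = {
  POST: 'CREATE',
  GET: 'READ',
  PUT: 'UPDATE',
  PATCH: 'UPDATE',
  DELETE: 'DELETE',
};

const permissionName = (method, entity) =>
  `${METHOD_MAP[method]}_${entity.toUpperCase()}`;

console.log(permissionName('GET', 'alert_events')); // 'READ_ALERT_EVENTS'
console.log(permissionName('DELETE', 'api_keys'));  // 'DELETE_API_KEYS'
```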

View File

@ -0,0 +1,11 @@
const util = require('util');
const Multer = require('multer');
const maxSize = 10 * 1024 * 1024;
const processFile = Multer({
storage: Multer.memoryStorage(),
limits: { fileSize: maxSize },
}).single('file');
const processFileMiddleware = util.promisify(processFile);
module.exports = processFileMiddleware;

View File

@ -0,0 +1,433 @@
const express = require('express');
const Alert_eventsService = require('../services/alert_events');
const Alert_eventsDBApi = require('../db/api/alert_events');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('alert_events'));
/**
* @swagger
* components:
* schemas:
* Alert_events:
* type: object
* properties:
* message:
* type: string
* default: message
* observed_value:
* type: integer
* format: int64
*
*/
/**
* @swagger
* tags:
* name: Alert_events
* description: The Alert_events managing API
*/
/**
* @swagger
* /api/alert_events:
* post:
* security:
* - bearerAuth: []
* tags: [Alert_events]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *              description: Data of the new item
* type: object
* $ref: "#/components/schemas/Alert_events"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alert_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Alert_eventsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
 *  /api/alert_events/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Alert_events]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *                description: Data of the imported items
* type: array
* items:
* $ref: "#/components/schemas/Alert_events"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alert_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Alert_eventsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alert_events/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Alert_events]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Alert_events"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alert_events"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Alert_eventsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alert_events/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Alert_events]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alert_events"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Alert_eventsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alert_events/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Alert_events]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
 *                description: IDs of the items to delete
* type: array
* responses:
* 200:
 *        description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alert_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Alert_eventsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alert_events:
* get:
* security:
* - bearerAuth: []
* tags: [Alert_events]
* summary: Get all alert_events
* description: Get all alert_events
* responses:
* 200:
* description: Alert_events list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Alert_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype;
const currentUser = req.currentUser;
const payload = await Alert_eventsDBApi.findAll(
req.query, { currentUser }
);
if (filetype === 'csv') {
const fields = ['id', 'message', 'observed_value', 'triggered_at'];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment('alert_events.csv');
res.send(csv);
} catch (err) {
console.error(err);
res.status(500).send(err.message);
}
} else {
res.status(200).send(payload);
}
}));
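The CSV branch above hands `payload.rows` and a `fields` list to json2csv's `parse()`. A hand-rolled stand-in (not json2csv itself) shows the output shape, with quoted headers and JSON-encoded cell values:

```javascript
// Hand-rolled stand-in for json2csv's parse(rows, { fields }); the real route
// uses the json2csv package, this only illustrates the resulting CSV shape.
function toCsv(rows, fields) {
  const header = fields.map((f) => `"${f}"`).join(',');
  const lines = rows.map((row) =>
    fields.map((f) => JSON.stringify(row[f] ?? '')).join(','),
  );
  return [header, ...lines].join('\n');
}

const csv = toCsv(
  [{ id: '1', message: 'threshold crossed', observed_value: 2100 }],
  ['id', 'message', 'observed_value'],
);
console.log(csv.split('\n')[0]); // '"id","message","observed_value"'
```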
/**
* @swagger
* /api/alert_events/count:
* get:
* security:
* - bearerAuth: []
* tags: [Alert_events]
* summary: Count all alert_events
* description: Count all alert_events
* responses:
* 200:
* description: Alert_events count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Alert_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Alert_eventsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alert_events/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Alert_events]
* summary: Find all alert_events that match search criteria
* description: Find all alert_events that match search criteria
* responses:
* 200:
* description: Alert_events list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Alert_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
const payload = await Alert_eventsDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alert_events/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Alert_events]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alert_events"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Alert_eventsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;

View File

@ -0,0 +1,438 @@
const express = require('express');
const AlertsService = require('../services/alerts');
const AlertsDBApi = require('../db/api/alerts');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('alerts'));
/**
* @swagger
* components:
* schemas:
* Alerts:
* type: object
* properties:
* name:
* type: string
* default: name
* rule_description:
* type: string
* default: rule_description
* threshold_value:
* type: integer
* format: int64
*
*
*
*/
/**
* @swagger
* tags:
* name: Alerts
* description: The Alerts managing API
*/
/**
* @swagger
* /api/alerts:
* post:
* security:
* - bearerAuth: []
* tags: [Alerts]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *              description: Data of the new item
* type: object
* $ref: "#/components/schemas/Alerts"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alerts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await AlertsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
 *  /api/alerts/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Alerts]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *                description: Data of the imported items
* type: array
* items:
* $ref: "#/components/schemas/Alerts"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alerts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await AlertsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alerts/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Alerts]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Alerts"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alerts"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await AlertsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alerts/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Alerts]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alerts"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await AlertsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alerts/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Alerts]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
 *                description: IDs of the items to delete
* type: array
* responses:
* 200:
 *        description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alerts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await AlertsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alerts:
* get:
* security:
* - bearerAuth: []
* tags: [Alerts]
* summary: Get all alerts
* description: Get all alerts
* responses:
* 200:
* description: Alerts list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Alerts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype;
const currentUser = req.currentUser;
const payload = await AlertsDBApi.findAll(
req.query, { currentUser }
);
if (filetype === 'csv') {
const fields = ['id', 'name', 'rule_description', 'threshold_value'];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment('alerts.csv');
res.send(csv);
} catch (err) {
console.error(err);
res.status(500).send(err.message);
}
} else {
res.status(200).send(payload);
}
}));
/**
* @swagger
* /api/alerts/count:
* get:
* security:
* - bearerAuth: []
* tags: [Alerts]
* summary: Count all alerts
* description: Count all alerts
* responses:
* 200:
* description: Alerts count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Alerts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await AlertsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alerts/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Alerts]
* summary: Find all alerts that match search criteria
* description: Find all alerts that match search criteria
* responses:
* 200:
* description: Alerts list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Alerts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
const payload = await AlertsDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/alerts/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Alerts]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Alerts"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await AlertsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;

View File

@ -0,0 +1,433 @@
const express = require('express');
const Api_keysService = require('../services/api_keys');
const Api_keysDBApi = require('../db/api/api_keys');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('api_keys'));
/**
* @swagger
* components:
* schemas:
* Api_keys:
* type: object
* properties:
* name:
* type: string
* default: name
* key_fingerprint:
* type: string
* default: key_fingerprint
*
*/
/**
* @swagger
* tags:
* name: Api_keys
* description: The Api_keys managing API
*/
/**
* @swagger
* /api/api_keys:
* post:
* security:
* - bearerAuth: []
* tags: [Api_keys]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Api_keys"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Api_keys"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Api_keysService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/budgets/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Api_keys]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the updated items
* type: array
* items:
* $ref: "#/components/schemas/Api_keys"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Api_keys"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Api_keysService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/api_keys/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Api_keys]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Api_keys"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Api_keys"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Api_keysService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/api_keys/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Api_keys]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Api_keys"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Api_keysService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/api_keys/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Api_keys]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
* description: IDs of the updated items
* type: array
* responses:
* 200:
* description: The items was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Api_keys"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Api_keysService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/api_keys:
* get:
* security:
* - bearerAuth: []
* tags: [Api_keys]
* summary: Get all api_keys
* description: Get all api_keys
* responses:
* 200:
* description: Api_keys list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Api_keys"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype
const currentUser = req.currentUser;
const payload = await Api_keysDBApi.findAll(
req.query, { currentUser }
);
if (filetype && filetype === 'csv') {
const fields = ['id','name','key_fingerprint',
'expires_at',
];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment(csv);
res.send(csv)
} catch (err) {
console.error(err);
}
} else {
res.status(200).send(payload);
}
}));
/**
* @swagger
* /api/api_keys/count:
* get:
* security:
* - bearerAuth: []
* tags: [Api_keys]
* summary: Count all api_keys
* description: Count all api_keys
* responses:
* 200:
* description: Api_keys count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Api_keys"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Api_keysDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/api_keys/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Api_keys]
* summary: Find all api_keys that match search criteria
* description: Find all api_keys that match search criteria
* responses:
* 200:
* description: Api_keys list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Api_keys"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', async (req, res) => {
const payload = await Api_keysDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
});
/**
* @swagger
* /api/api_keys/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Api_keys]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Api_keys"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Api_keysDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;

View File

@ -0,0 +1,439 @@
const express = require('express');
const AssetsService = require('../services/assets');
const AssetsDBApi = require('../db/api/assets');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('assets'));
/**
* @swagger
* components:
* schemas:
* Assets:
* type: object
* properties:
* symbol:
* type: string
* default: symbol
* name:
* type: string
* default: name
* currency:
* type: string
* default: currency
* exchange_venue:
* type: string
* default: exchange_venue
*
*/
/**
* @swagger
* tags:
* name: Assets
* description: The Assets managing API
*/
/**
* @swagger
* /api/assets:
* post:
* security:
* - bearerAuth: []
* tags: [Assets]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Assets"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Assets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await AssetsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/budgets/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Assets]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the updated items
* type: array
* items:
* $ref: "#/components/schemas/Assets"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Assets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await AssetsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/assets/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Assets]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Assets"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Assets"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await AssetsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/assets/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Assets]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Assets"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await AssetsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/assets/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Assets]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
* description: IDs of the updated items
* type: array
* responses:
* 200:
* description: The items was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Assets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await AssetsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/assets:
* get:
* security:
* - bearerAuth: []
* tags: [Assets]
* summary: Get all assets
* description: Get all assets
* responses:
* 200:
* description: Assets list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Assets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype
const currentUser = req.currentUser;
const payload = await AssetsDBApi.findAll(
req.query, { currentUser }
);
if (filetype && filetype === 'csv') {
const fields = ['id','symbol','name','currency','exchange_venue',
];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment(csv);
res.send(csv)
} catch (err) {
console.error(err);
}
} else {
res.status(200).send(payload);
}
}));
/**
* @swagger
* /api/assets/count:
* get:
* security:
* - bearerAuth: []
* tags: [Assets]
* summary: Count all assets
* description: Count all assets
* responses:
* 200:
* description: Assets count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Assets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await AssetsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/assets/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Assets]
* summary: Find all assets that match search criteria
* description: Find all assets that match search criteria
* responses:
* 200:
* description: Assets list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Assets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', async (req, res) => {
const payload = await AssetsDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
});
/**
* @swagger
* /api/assets/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Assets]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Assets"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await AssetsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;

View File

@ -0,0 +1,440 @@
const express = require('express');
const Audit_eventsService = require('../services/audit_events');
const Audit_eventsDBApi = require('../db/api/audit_events');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('audit_events'));
/**
* @swagger
* components:
* schemas:
* Audit_events:
* type: object
* properties:
* resource_type:
* type: string
* default: resource_type
* resource_identifier:
* type: string
* default: resource_identifier
* details:
* type: string
* default: details
* ip_address:
* type: string
* default: ip_address
*
*
*/
/**
* @swagger
* tags:
* name: Audit_events
* description: The Audit_events managing API
*/
/**
* @swagger
* /api/audit_events:
* post:
* security:
* - bearerAuth: []
* tags: [Audit_events]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Audit_events"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Audit_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Audit_eventsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/budgets/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Audit_events]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the updated items
* type: array
* items:
* $ref: "#/components/schemas/Audit_events"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Audit_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Audit_eventsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/audit_events/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Audit_events]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Audit_events"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Audit_events"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Audit_eventsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/audit_events/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Audit_events]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Audit_events"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Audit_eventsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/audit_events/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Audit_events]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
* description: IDs of the updated items
* type: array
* responses:
* 200:
* description: The items was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Audit_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Audit_eventsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/audit_events:
* get:
* security:
* - bearerAuth: []
* tags: [Audit_events]
* summary: Get all audit_events
* description: Get all audit_events
* responses:
* 200:
* description: Audit_events list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Audit_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype
const currentUser = req.currentUser;
const payload = await Audit_eventsDBApi.findAll(
req.query, { currentUser }
);
if (filetype && filetype === 'csv') {
const fields = ['id','resource_type','resource_identifier','details','ip_address',
'occurred_at',
];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment(csv);
res.send(csv)
} catch (err) {
console.error(err);
}
} else {
res.status(200).send(payload);
}
}));
/**
* @swagger
* /api/audit_events/count:
* get:
* security:
* - bearerAuth: []
* tags: [Audit_events]
* summary: Count all audit_events
* description: Count all audit_events
* responses:
* 200:
* description: Audit_events count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Audit_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Audit_eventsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/audit_events/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Audit_events]
* summary: Find all audit_events that match search criteria
* description: Find all audit_events that match search criteria
* responses:
* 200:
* description: Audit_events list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Audit_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', async (req, res) => {
const payload = await Audit_eventsDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
});
/**
* @swagger
* /api/audit_events/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Audit_events]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Audit_events"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Audit_eventsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;

207
backend/src/routes/auth.js Normal file
View File

@ -0,0 +1,207 @@
const express = require('express');
const passport = require('passport');
const config = require('../config');
const AuthService = require('../services/auth');
const ForbiddenError = require('../services/notifications/errors/forbidden');
const EmailSender = require('../services/email');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
/**
* @swagger
* components:
* schemas:
* Auth:
* type: object
* required:
* - email
* - password
* properties:
* email:
* type: string
* default: admin@flatlogic.com
* description: User email
* password:
* type: string
* default: password
* description: User password
*/
/**
* @swagger
* tags:
* name: Auth
* description: Authorization operations
*/
/**
* @swagger
* /api/auth/signin/local:
* post:
* tags: [Auth]
* summary: Logs user into the system
* description: Logs user into the system
* requestBody:
* description: Set valid user email and password
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Auth"
* responses:
* 200:
* description: Successful login
* 400:
* description: Invalid username/password supplied
* x-codegen-request-body-name: body
*/
router.post('/signin/local', wrapAsync(async (req, res) => {
const payload = await AuthService.signin(req.body.email, req.body.password, req,);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/auth/me:
* get:
* security:
* - bearerAuth: []
* tags: [Auth]
* summary: Get current authorized user info
* description: Get current authorized user info
* responses:
* 200:
* description: Successful retrieval of current authorized user data
* 400:
* description: Invalid username/password supplied
* x-codegen-request-body-name: body
*/
router.get('/me', passport.authenticate('jwt', {session: false}), (req, res) => {
if (!req.currentUser || !req.currentUser.id) {
throw new ForbiddenError();
}
const payload = req.currentUser;
delete payload.password;
res.status(200).send(payload);
});
router.put('/password-reset', wrapAsync(async (req, res) => {
const payload = await AuthService.passwordReset(req.body.token, req.body.password, req,);
res.status(200).send(payload);
}));
router.put('/password-update', passport.authenticate('jwt', {session: false}), wrapAsync(async (req, res) => {
const payload = await AuthService.passwordUpdate(req.body.currentPassword, req.body.newPassword, req);
res.status(200).send(payload);
}));
router.post('/send-email-address-verification-email', passport.authenticate('jwt', {session: false}), wrapAsync(async (req, res) => {
if (!req.currentUser) {
throw new ForbiddenError();
}
await AuthService.sendEmailAddressVerificationEmail(req.currentUser.email);
const payload = true;
res.status(200).send(payload);
}));
router.post('/send-password-reset-email', wrapAsync(async (req, res) => {
const link = new URL(req.headers.referer);
await AuthService.sendPasswordResetEmail(req.body.email, 'register', link.host,);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/auth/signup:
* post:
* tags: [Auth]
* summary: Register new user into the system
* description: Register new user into the system
* requestBody:
* description: Set valid user email and password
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Auth"
* responses:
* 200:
* description: New user successfully signed up
* 400:
* description: Invalid username/password supplied
* 500:
* description: Some server error
* x-codegen-request-body-name: body
*/
router.post('/signup', wrapAsync(async (req, res) => {
const link = new URL(req.headers.referer);
const payload = await AuthService.signup(
req.body.email,
req.body.password,
req,
link.host,
)
res.status(200).send(payload);
}));
router.put('/profile', passport.authenticate('jwt', {session: false}), wrapAsync(async (req, res) => {
if (!req.currentUser || !req.currentUser.id) {
throw new ForbiddenError();
}
await AuthService.updateProfile(req.body.profile, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
router.put('/verify-email', wrapAsync(async (req, res) => {
const payload = await AuthService.verifyEmail(req.body.token, req, req.headers.referer)
res.status(200).send(payload);
}));
router.get('/email-configured', (req, res) => {
const payload = EmailSender.isConfigured;
res.status(200).send(payload);
});
router.get('/signin/google', (req, res, next) => {
passport.authenticate("google", {scope: ["profile", "email"], state: req.query.app})(req, res, next);
});
router.get('/signin/google/callback', passport.authenticate("google", {failureRedirect: "/login", session: false}),
function (req, res) {
socialRedirect(res, req.query.state, req.user.token, config);
}
);
router.get('/signin/microsoft', (req, res, next) => {
passport.authenticate("microsoft", {
scope: ["https://graph.microsoft.com/user.read openid"],
state: req.query.app
})(req, res, next);
});
router.get('/signin/microsoft/callback', passport.authenticate("microsoft", {
failureRedirect: "/login",
session: false
}),
function (req, res) {
socialRedirect(res, req.query.state, req.user.token, config);
}
);
router.use('/', require('../helpers').commonErrorHandler);
function socialRedirect(res, state, token, config) {
res.redirect(config.uiUrl + "/login?token=" + token);
}
module.exports = router;

View File

View File

@ -0,0 +1,441 @@
const express = require('express');
const Data_sourcesService = require('../services/data_sources');
const Data_sourcesDBApi = require('../db/api/data_sources');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('data_sources'));
/**
* @swagger
* components:
* schemas:
* Data_sources:
* type: object
* properties:
* name:
* type: string
* default: name
* connection_type:
* type: string
* default: connection_type
* coverage_description:
* type: string
* default: coverage_description
* last_failure_reason:
* type: string
* default: last_failure_reason
*/
/**
* @swagger
* tags:
* name: Data_sources
* description: The Data_sources managing API
*/
/**
* @swagger
* /api/data_sources:
* post:
* security:
* - bearerAuth: []
* tags: [Data_sources]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the new item
* type: object
* $ref: "#/components/schemas/Data_sources"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Data_sources"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Data_sourcesService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
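The create handler reconstructs the caller's host from the `Referer` header, falling back to the request's own URL. That logic can be read as a small helper (the name `resolveHost` is illustrative, not part of the codebase):

```javascript
// Hypothetical helper: resolve the requesting host from the Referer
// header, falling back to the request's own protocol/host/path.
function resolveHost(headers, protocol, hostname, originalUrl) {
  const referer = headers.referer || `${protocol}://${hostname}${originalUrl}`;
  return new URL(referer).host;
}
```

The resulting host is passed to the service layer, presumably so generated links (e.g. in notification emails) point back at the right deployment.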
/**
* @swagger
* /api/data_sources/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Data_sources]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the items to import
* type: array
* items:
* $ref: "#/components/schemas/Data_sources"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Data_sources"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Data_sourcesService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/data_sources/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Data_sources]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Data_sources"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Data_sources"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Data_sourcesService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/data_sources/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Data_sources]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Data_sources"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Data_sourcesService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/data_sources/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Data_sources]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
* description: IDs of the items to delete
* type: array
* responses:
* 200:
* description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Data_sources"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Data_sourcesService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/data_sources:
* get:
* security:
* - bearerAuth: []
* tags: [Data_sources]
* summary: Get all data_sources
* description: Get all data_sources
* responses:
* 200:
* description: Data_sources list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Data_sources"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype;
const currentUser = req.currentUser;
const payload = await Data_sourcesDBApi.findAll(
req.query, { currentUser }
);
if (filetype && filetype === 'csv') {
const fields = ['id','name','connection_type','coverage_description','last_failure_reason',
'last_success_at','last_failure_at',
];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment('data_sources.csv');
res.send(csv);
} catch (err) {
console.error(err);
res.status(500).send({ error: 'Failed to export CSV' });
}
} else {
res.status(200).send(payload);
}
}));
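The CSV branch above is repeated almost verbatim in every listing route in this commit. One way to factor it out is a small response helper; `sendListing` and its signature are hypothetical, and `parseCsv` stands in for `json2csv`'s `parse`:

```javascript
// Hypothetical helper consolidating the repeated CSV-or-JSON response
// logic. parseCsv is injected so the helper stays dependency-free here.
function sendListing(res, payload, { filetype, fields, filename, parseCsv }) {
  if (filetype !== 'csv') {
    return res.status(200).send(payload);
  }
  try {
    const csv = parseCsv(payload.rows, { fields });
    // Express res.attachment() expects a filename, not the CSV body.
    return res.status(200).attachment(filename).send(csv);
  } catch (err) {
    console.error(err);
    return res.status(500).send({ error: 'CSV export failed' });
  }
}
```

A route could then call `sendListing(res, payload, { filetype, fields, filename: 'data_sources.csv', parseCsv: parse })` and stay three lines long.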
/**
* @swagger
* /api/data_sources/count:
* get:
* security:
* - bearerAuth: []
* tags: [Data_sources]
* summary: Count all data_sources
* description: Count all data_sources
* responses:
* 200:
* description: Data_sources count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Data_sources"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Data_sourcesDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/data_sources/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Data_sources]
* summary: Find all data_sources that match search criteria
* description: Find all data_sources that match search criteria
* responses:
* 200:
* description: Data_sources list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Data_sources"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
const payload = await Data_sourcesDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/data_sources/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Data_sources]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Data_sources"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Data_sourcesDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;


@@ -0,0 +1,439 @@
const express = require('express');
const Factor_attributionsService = require('../services/factor_attributions');
const Factor_attributionsDBApi = require('../db/api/factor_attributions');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('factor_attributions'));
/**
* @swagger
* components:
* schemas:
* Factor_attributions:
* type: object
* properties:
* factor_name:
* type: string
* default: factor_name
* notes:
* type: string
* default: notes
* contribution:
* type: integer
* format: int64
* importance:
* type: integer
* format: int64
*
*/
/**
* @swagger
* tags:
* name: Factor_attributions
* description: The Factor_attributions managing API
*/
/**
* @swagger
* /api/factor_attributions:
* post:
* security:
* - bearerAuth: []
* tags: [Factor_attributions]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the new item
* type: object
* $ref: "#/components/schemas/Factor_attributions"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Factor_attributions"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Factor_attributionsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/factor_attributions/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Factor_attributions]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the items to import
* type: array
* items:
* $ref: "#/components/schemas/Factor_attributions"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Factor_attributions"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Factor_attributionsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/factor_attributions/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Factor_attributions]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Factor_attributions"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Factor_attributions"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Factor_attributionsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/factor_attributions/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Factor_attributions]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Factor_attributions"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Factor_attributionsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/factor_attributions/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Factor_attributions]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
* description: IDs of the items to delete
* type: array
* responses:
* 200:
* description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Factor_attributions"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Factor_attributionsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/factor_attributions:
* get:
* security:
* - bearerAuth: []
* tags: [Factor_attributions]
* summary: Get all factor_attributions
* description: Get all factor_attributions
* responses:
* 200:
* description: Factor_attributions list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Factor_attributions"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype;
const currentUser = req.currentUser;
const payload = await Factor_attributionsDBApi.findAll(
req.query, { currentUser }
);
if (filetype && filetype === 'csv') {
const fields = ['id','factor_name','notes',
'contribution','importance',
];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment('factor_attributions.csv');
res.send(csv);
} catch (err) {
console.error(err);
res.status(500).send({ error: 'Failed to export CSV' });
}
} else {
res.status(200).send(payload);
}
}));
/**
* @swagger
* /api/factor_attributions/count:
* get:
* security:
* - bearerAuth: []
* tags: [Factor_attributions]
* summary: Count all factor_attributions
* description: Count all factor_attributions
* responses:
* 200:
* description: Factor_attributions count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Factor_attributions"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Factor_attributionsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/factor_attributions/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Factor_attributions]
* summary: Find all factor_attributions that match search criteria
* description: Find all factor_attributions that match search criteria
* responses:
* 200:
* description: Factor_attributions list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Factor_attributions"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
const payload = await Factor_attributionsDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/factor_attributions/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Factor_attributions]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Factor_attributions"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Factor_attributionsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;


@@ -0,0 +1,433 @@
const express = require('express');
const Feature_setsService = require('../services/feature_sets');
const Feature_setsDBApi = require('../db/api/feature_sets');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('feature_sets'));
/**
* @swagger
* components:
* schemas:
* Feature_sets:
* type: object
* properties:
* name:
* type: string
* default: name
* description:
* type: string
* default: description
*
*/
/**
* @swagger
* tags:
* name: Feature_sets
* description: The Feature_sets managing API
*/
/**
* @swagger
* /api/feature_sets:
* post:
* security:
* - bearerAuth: []
* tags: [Feature_sets]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the new item
* type: object
* $ref: "#/components/schemas/Feature_sets"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Feature_sets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Feature_setsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/feature_sets/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Feature_sets]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the items to import
* type: array
* items:
* $ref: "#/components/schemas/Feature_sets"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Feature_sets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Feature_setsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/feature_sets/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Feature_sets]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Feature_sets"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Feature_sets"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Feature_setsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/feature_sets/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Feature_sets]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Feature_sets"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Feature_setsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/feature_sets/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Feature_sets]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
* description: IDs of the items to delete
* type: array
* responses:
* 200:
* description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Feature_sets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Feature_setsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/feature_sets:
* get:
* security:
* - bearerAuth: []
* tags: [Feature_sets]
* summary: Get all feature_sets
* description: Get all feature_sets
* responses:
* 200:
* description: Feature_sets list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Feature_sets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype;
const currentUser = req.currentUser;
const payload = await Feature_setsDBApi.findAll(
req.query, { currentUser }
);
if (filetype && filetype === 'csv') {
const fields = ['id','name','description',
'effective_from_at','effective_to_at',
];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment('feature_sets.csv');
res.send(csv);
} catch (err) {
console.error(err);
res.status(500).send({ error: 'Failed to export CSV' });
}
} else {
res.status(200).send(payload);
}
}));
/**
* @swagger
* /api/feature_sets/count:
* get:
* security:
* - bearerAuth: []
* tags: [Feature_sets]
* summary: Count all feature_sets
* description: Count all feature_sets
* responses:
* 200:
* description: Feature_sets count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Feature_sets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Feature_setsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/feature_sets/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Feature_sets]
* summary: Find all feature_sets that match search criteria
* description: Find all feature_sets that match search criteria
* responses:
* 200:
* description: Feature_sets list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Feature_sets"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
const payload = await Feature_setsDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/feature_sets/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Feature_sets]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Feature_sets"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Feature_setsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;


@@ -0,0 +1,32 @@
const express = require('express');
const config = require('../config');
const path = require('path');
const passport = require('passport');
const services = require('../services/file');
const router = express.Router();
router.get('/download', (req, res) => {
if (process.env.NODE_ENV === 'production' || process.env.NEXT_PUBLIC_BACK_API) {
services.downloadGCloud(req, res);
}
else {
services.downloadLocal(req, res);
}
});
router.post('/upload/:table/:field', passport.authenticate('jwt', {session: false}), (req, res) => {
const fileName = `${req.params.table}/${req.params.field}`;
if (process.env.NODE_ENV === 'production' || process.env.NEXT_PUBLIC_BACK_API) {
services.uploadGCloud(fileName, req, res);
}
else {
services.uploadLocal(fileName, {
entity: null,
maxFileSize: 10 * 1024 * 1024,
folderIncludesAuthenticationUid: false,
})(req, res);
}
});
module.exports = router;
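Both file routes branch on the same environment check to pick cloud or local storage. That decision can be sketched as a tiny selector resolved once at startup (the function and its shape are illustrative, not part of the codebase):

```javascript
// Hypothetical sketch: resolve the storage backend once from the
// environment instead of repeating the condition in every route.
function pickStorage(env, services) {
  const useCloud = env.NODE_ENV === 'production' || Boolean(env.NEXT_PUBLIC_BACK_API);
  return useCloud
    ? { download: services.downloadGCloud, upload: services.uploadGCloud }
    : { download: services.downloadLocal, upload: services.uploadLocal };
}
```

A route body could then read `pickStorage(process.env, services).download(req, res)`, keeping the environment logic in one place.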


@@ -0,0 +1,449 @@
const express = require('express');
const ForecastsService = require('../services/forecasts');
const ForecastsDBApi = require('../db/api/forecasts');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('forecasts'));
/**
* @swagger
* components:
* schemas:
* Forecasts:
* type: object
* properties:
* explainability_summary:
* type: string
* default: explainability_summary
* point_estimate:
* type: integer
* format: int64
* p10:
* type: integer
* format: int64
* p50:
* type: integer
* format: int64
* p90:
* type: integer
* format: int64
* volatility_forecast:
* type: integer
* format: int64
* signal_confidence:
* type: integer
* format: int64
*/
/**
* @swagger
* tags:
* name: Forecasts
* description: The Forecasts managing API
*/
/**
* @swagger
* /api/forecasts:
* post:
* security:
* - bearerAuth: []
* tags: [Forecasts]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the new item
* type: object
* $ref: "#/components/schemas/Forecasts"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Forecasts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await ForecastsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/forecasts/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Forecasts]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the items to import
* type: array
* items:
* $ref: "#/components/schemas/Forecasts"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Forecasts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await ForecastsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/forecasts/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Forecasts]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Forecasts"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Forecasts"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await ForecastsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/forecasts/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Forecasts]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Forecasts"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await ForecastsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/forecasts/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Forecasts]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
 *                description: IDs of the items to delete
* type: array
* responses:
* 200:
 *          description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Forecasts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await ForecastsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/forecasts:
* get:
* security:
* - bearerAuth: []
* tags: [Forecasts]
* summary: Get all forecasts
* description: Get all forecasts
* responses:
* 200:
* description: Forecasts list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Forecasts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype;
const currentUser = req.currentUser;
const payload = await ForecastsDBApi.findAll(
req.query, { currentUser }
);
if (filetype && filetype === 'csv') {
const fields = ['id','explainability_summary',
'point_estimate','p10','p50','p90','volatility_forecast','signal_confidence',
'as_of_at','target_time_at',
];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment('forecasts.csv');
res.send(csv);
} catch (err) {
console.error(err);
res.status(500).send({ error: 'Failed to export CSV' });
}
} else {
res.status(200).send(payload);
}
}));
/**
* @swagger
* /api/forecasts/count:
* get:
* security:
* - bearerAuth: []
* tags: [Forecasts]
* summary: Count all forecasts
* description: Count all forecasts
* responses:
* 200:
* description: Forecasts count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Forecasts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await ForecastsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/forecasts/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Forecasts]
* summary: Find all forecasts that match search criteria
* description: Find all forecasts that match search criteria
* responses:
* 200:
* description: Forecasts list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Forecasts"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
const payload = await ForecastsDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/forecasts/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Forecasts]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Forecasts"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await ForecastsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;
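Every handler above is wrapped in `wrapAsync` from `../helpers`. That helper's source is not part of this file; a minimal sketch of what such a wrapper typically does (forwarding a rejected promise from an async handler into Express's error chain, so the `commonErrorHandler` mounted at the bottom of each router can respond) looks like this — an assumption about its shape, not the committed implementation:

```javascript
// Minimal sketch of a wrapAsync helper (the real one lives in src/helpers;
// this is an assumed shape, not the committed code). Any rejection from the
// async handler is passed to next(), reaching the router's error handler.
function wrapAsync(handler) {
  return function (req, res, next) {
    Promise.resolve(handler(req, res, next)).catch(next);
  };
}

module.exports = { wrapAsync };
```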

const express = require('express');
const Geopolitical_eventsService = require('../services/geopolitical_events');
const Geopolitical_eventsDBApi = require('../db/api/geopolitical_events');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('geopolitical_events'));
/**
* @swagger
* components:
* schemas:
* Geopolitical_events:
* type: object
* properties:
* title:
* type: string
* default: title
* summary:
* type: string
* default: summary
* source_summary:
* type: string
* default: source_summary
* confidence_score:
* type: integer
* format: int64
*
*
*
*/
/**
* @swagger
* tags:
* name: Geopolitical_events
* description: The Geopolitical_events managing API
*/
/**
* @swagger
* /api/geopolitical_events:
* post:
* security:
* - bearerAuth: []
* tags: [Geopolitical_events]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *              description: Data of the new item
* type: object
* $ref: "#/components/schemas/Geopolitical_events"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Geopolitical_eventsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
 *  /api/geopolitical_events/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Geopolitical_events]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *              description: Data of the items to import
* type: array
* items:
* $ref: "#/components/schemas/Geopolitical_events"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Geopolitical_eventsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_events/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Geopolitical_events]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Geopolitical_events"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_events"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Geopolitical_eventsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_events/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Geopolitical_events]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_events"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Geopolitical_eventsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_events/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Geopolitical_events]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
 *                description: IDs of the items to delete
* type: array
* responses:
* 200:
 *          description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Geopolitical_eventsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_events:
* get:
* security:
* - bearerAuth: []
* tags: [Geopolitical_events]
* summary: Get all geopolitical_events
* description: Get all geopolitical_events
* responses:
* 200:
* description: Geopolitical_events list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Geopolitical_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype;
const currentUser = req.currentUser;
const payload = await Geopolitical_eventsDBApi.findAll(
req.query, { currentUser }
);
if (filetype && filetype === 'csv') {
const fields = ['id','title','summary','source_summary',
'confidence_score',
'event_start_at','event_end_at',
];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment('geopolitical_events.csv');
res.send(csv);
} catch (err) {
console.error(err);
res.status(500).send({ error: 'Failed to export CSV' });
}
} else {
res.status(200).send(payload);
}
}));
/**
* @swagger
* /api/geopolitical_events/count:
* get:
* security:
* - bearerAuth: []
* tags: [Geopolitical_events]
* summary: Count all geopolitical_events
* description: Count all geopolitical_events
* responses:
* 200:
* description: Geopolitical_events count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Geopolitical_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Geopolitical_eventsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_events/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Geopolitical_events]
* summary: Find all geopolitical_events that match search criteria
* description: Find all geopolitical_events that match search criteria
* responses:
* 200:
* description: Geopolitical_events list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Geopolitical_events"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
const payload = await Geopolitical_eventsDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_events/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Geopolitical_events]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_events"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Geopolitical_eventsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;
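Each POST and bulk-import handler above rebuilds an absolute URL from the `Referer` header, falling back to `protocol://hostname + originalUrl`, solely to pass `link.host` into the service. The pattern can be exercised in isolation; `hostFromRequest` is a name introduced here for illustration — the routers inline this logic rather than sharing a helper:

```javascript
// Illustrative extraction of the referer-handling pattern used by the POST
// and bulk-import handlers. `hostFromRequest` is a hypothetical helper name.
function hostFromRequest(req) {
  const referer =
    req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
  // URL is a Node.js global; it throws on a malformed referer, which the
  // routers rely on wrapAsync to surface as a server error.
  return new URL(referer).host;
}
```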

const express = require('express');
const Geopolitical_scoresService = require('../services/geopolitical_scores');
const Geopolitical_scoresDBApi = require('../db/api/geopolitical_scores');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('geopolitical_scores'));
/**
* @swagger
* components:
* schemas:
* Geopolitical_scores:
* type: object
* properties:
* methodology_note:
* type: string
* default: methodology_note
* score_value:
* type: integer
* format: int64
* iran_conflict_weight:
* type: integer
* format: int64
*
*
*/
/**
* @swagger
* tags:
* name: Geopolitical_scores
* description: The Geopolitical_scores managing API
*/
/**
* @swagger
* /api/geopolitical_scores:
* post:
* security:
* - bearerAuth: []
* tags: [Geopolitical_scores]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *              description: Data of the new item
* type: object
* $ref: "#/components/schemas/Geopolitical_scores"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_scores"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Geopolitical_scoresService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
 *  /api/geopolitical_scores/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Geopolitical_scores]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *              description: Data of the items to import
* type: array
* items:
* $ref: "#/components/schemas/Geopolitical_scores"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_scores"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Geopolitical_scoresService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_scores/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Geopolitical_scores]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Geopolitical_scores"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_scores"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Geopolitical_scoresService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_scores/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Geopolitical_scores]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_scores"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Geopolitical_scoresService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_scores/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Geopolitical_scores]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
 *                description: IDs of the items to delete
* type: array
* responses:
* 200:
 *          description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_scores"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Geopolitical_scoresService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_scores:
* get:
* security:
* - bearerAuth: []
* tags: [Geopolitical_scores]
* summary: Get all geopolitical_scores
* description: Get all geopolitical_scores
* responses:
* 200:
* description: Geopolitical_scores list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Geopolitical_scores"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype;
const currentUser = req.currentUser;
const payload = await Geopolitical_scoresDBApi.findAll(
req.query, { currentUser }
);
if (filetype && filetype === 'csv') {
const fields = ['id','methodology_note',
'score_value','iran_conflict_weight',
'as_of_at',
];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment('geopolitical_scores.csv');
res.send(csv);
} catch (err) {
console.error(err);
res.status(500).send({ error: 'Failed to export CSV' });
}
} else {
res.status(200).send(payload);
}
}));
/**
* @swagger
* /api/geopolitical_scores/count:
* get:
* security:
* - bearerAuth: []
* tags: [Geopolitical_scores]
* summary: Count all geopolitical_scores
* description: Count all geopolitical_scores
* responses:
* 200:
* description: Geopolitical_scores count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Geopolitical_scores"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Geopolitical_scoresDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_scores/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Geopolitical_scores]
* summary: Find all geopolitical_scores that match search criteria
* description: Find all geopolitical_scores that match search criteria
* responses:
* 200:
* description: Geopolitical_scores list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Geopolitical_scores"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
const payload = await Geopolitical_scoresDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/geopolitical_scores/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Geopolitical_scores]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Geopolitical_scores"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Geopolitical_scoresDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;
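The `?filetype=csv` branch in each list handler passes `payload.rows` and a field list to json2csv's `parse`. For intuition, the transformation it performs is roughly the following — a simplified sketch, not the library's code; the real `parse` also handles nested field paths, custom delimiters, and configurable quoting:

```javascript
// Simplified sketch of what json2csv's parse(rows, { fields }) produces:
// a quoted header row, then one quoted row per record, with embedded double
// quotes doubled. Illustration only; behavior is an approximation.
function toCsvSketch(rows, fields) {
  const quote = (v) => `"${String(v == null ? '' : v).replace(/"/g, '""')}"`;
  const header = fields.map(quote).join(',');
  const lines = rows.map((row) => fields.map((f) => quote(row[f])).join(','));
  return [header, ...lines].join('\n');
}
```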

const express = require('express');
const Macro_indicatorsService = require('../services/macro_indicators');
const Macro_indicatorsDBApi = require('../db/api/macro_indicators');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('macro_indicators'));
/**
* @swagger
* components:
* schemas:
* Macro_indicators:
* type: object
* properties:
* code:
* type: string
* default: code
* name:
* type: string
* default: name
* unit:
* type: string
* default: unit
* region:
* type: string
* default: region
*
*
*/
/**
* @swagger
* tags:
* name: Macro_indicators
* description: The Macro_indicators managing API
*/
/**
* @swagger
* /api/macro_indicators:
* post:
* security:
* - bearerAuth: []
* tags: [Macro_indicators]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *              description: Data of the new item
* type: object
* $ref: "#/components/schemas/Macro_indicators"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Macro_indicators"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Macro_indicatorsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
 *  /api/macro_indicators/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Macro_indicators]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *              description: Data of the items to import
* type: array
* items:
* $ref: "#/components/schemas/Macro_indicators"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Macro_indicators"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Macro_indicatorsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/macro_indicators/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Macro_indicators]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Macro_indicators"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Macro_indicators"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Macro_indicatorsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/macro_indicators/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Macro_indicators]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Macro_indicators"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Macro_indicatorsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/macro_indicators/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Macro_indicators]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* ids:
 *                description: IDs of the items to delete
* type: array
* responses:
* 200:
 *          description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Macro_indicators"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Macro_indicatorsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/macro_indicators:
* get:
* security:
* - bearerAuth: []
* tags: [Macro_indicators]
* summary: Get all macro_indicators
* description: Get all macro_indicators
* responses:
* 200:
* description: Macro_indicators list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Macro_indicators"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
const filetype = req.query.filetype;
const currentUser = req.currentUser;
const payload = await Macro_indicatorsDBApi.findAll(
req.query, { currentUser }
);
if (filetype && filetype === 'csv') {
const fields = ['id','code','name','unit','region',
];
const opts = { fields };
try {
const csv = parse(payload.rows, opts);
res.status(200).attachment('macro_indicators.csv');
res.send(csv);
} catch (err) {
console.error(err);
res.status(500).send({ error: 'Failed to export CSV' });
}
} else {
res.status(200).send(payload);
}
}));
/**
* @swagger
* /api/macro_indicators/count:
* get:
* security:
* - bearerAuth: []
* tags: [Macro_indicators]
* summary: Count all macro_indicators
* description: Count all macro_indicators
* responses:
* 200:
* description: Macro_indicators count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Macro_indicators"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Macro_indicatorsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/macro_indicators/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Macro_indicators]
* summary: Find all macro_indicators that match search criteria
* description: Find all macro_indicators that match search criteria
* responses:
* 200:
* description: Macro_indicators list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Macro_indicators"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
const payload = await Macro_indicatorsDBApi.findAllAutocomplete(
req.query.query,
req.query.limit,
req.query.offset,
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/macro_indicators/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Macro_indicators]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Macro_indicators"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Macro_indicatorsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;
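Every router mounts `checkCrudPermissions('<entity>')` from `../middlewares/check-permissions` before its routes. That middleware is defined elsewhere; a minimal sketch of the shape such a per-entity guard usually takes follows. The permission-string format and field names here are assumptions for illustration, not the committed code:

```javascript
// Hypothetical sketch of a per-entity CRUD guard like checkCrudPermissions.
// The 'READ_<ENTITY>' / 'WRITE_<ENTITY>' permission format is assumed; the
// real middleware may derive permissions differently.
function checkCrudPermissions(entity) {
  return function (req, res, next) {
    const verb = req.method === 'GET' ? 'READ' : 'WRITE';
    const needed = `${verb}_${entity.toUpperCase()}`;
    const granted = (req.currentUser && req.currentUser.permissions) || [];
    if (granted.includes(needed)) {
      return next();
    }
    res.statusCode = 403;
    res.end('Forbidden');
  };
}
```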

const express = require('express');
const Mining_companiesService = require('../services/mining_companies');
const Mining_companiesDBApi = require('../db/api/mining_companies');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('mining_companies'));
/**
* @swagger
* components:
* schemas:
* Mining_companies:
* type: object
* properties:
* ticker:
* type: string
* default: ticker
* company_name:
* type: string
* default: company_name
* country:
* type: string
* default: country
* primary_mines:
* type: string
* default: primary_mines
*/
/**
* @swagger
* tags:
* name: Mining_companies
* description: The Mining_companies managing API
*/
/**
* @swagger
* /api/mining_companies:
* post:
* security:
* - bearerAuth: []
* tags: [Mining_companies]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *             description: Data of the new item
* type: object
* $ref: "#/components/schemas/Mining_companies"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_companies"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Mining_companiesService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
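The create and bulk-import handlers rebuild a referer URL when no `Referer` header is present, then forward only its host to the service layer. A self-contained sketch of that fallback (the `req` values below are stubs for illustration):

```javascript
// Sketch of the referer fallback used by the create/bulk-import handlers:
// when no Referer header is sent, rebuild a URL from request parts and
// extract its host. (req is stubbed here for illustration.)
const req = {
  headers: {},                       // no Referer header sent
  protocol: 'https',
  hostname: 'app.example.com',
  originalUrl: '/api/mining_companies',
};
const referer =
  req.headers.referer ||
  `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
console.log(link.host); // "app.example.com"
```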
/**
* @swagger
 * /api/mining_companies/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Mining_companies]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *             description: Data of the items to import
* type: array
* items:
* $ref: "#/components/schemas/Mining_companies"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_companies"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Mining_companiesService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_companies/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Mining_companies]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Mining_companies"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_companies"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Mining_companiesService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_companies/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Mining_companies]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_companies"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Mining_companiesService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_companies/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Mining_companies]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
 *             data:
 *               description: IDs of the items to delete
 *               type: array
* responses:
* 200:
 *         description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_companies"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Mining_companiesService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_companies:
* get:
* security:
* - bearerAuth: []
* tags: [Mining_companies]
* summary: Get all mining_companies
* description: Get all mining_companies
* responses:
* 200:
* description: Mining_companies list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Mining_companies"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
  const filetype = req.query.filetype;
  const currentUser = req.currentUser;
  const payload = await Mining_companiesDBApi.findAll(
    req.query, { currentUser }
  );
  if (filetype && filetype === 'csv') {
    const fields = ['id', 'ticker', 'company_name', 'country', 'primary_mines'];
    const opts = { fields };
    try {
      const csv = parse(payload.rows, opts);
      // attachment() expects a filename; the CSV body goes in send()
      res.status(200).attachment('mining_companies.csv');
      res.send(csv);
    } catch (err) {
      console.error(err);
      res.status(500).send(err);
    }
  } else {
    res.status(200).send(payload);
  }
}));
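json2csv's `parse(rows, { fields })` serializes each row object to the listed columns, in order, with a header line first. A hand-rolled approximation of what it produces (illustration only — the real library additionally handles quote escaping, nested dot-paths, and transforms):

```javascript
// Illustrative stand-in for json2csv's parse(rows, { fields }):
// header row from the field list, then one line per object, fields in order.
// The real json2csv also escapes embedded quotes/commas and supports dot-paths.
function toCsv(rows, fields) {
  const header = fields.map((f) => `"${f}"`).join(',');
  const lines = rows.map((row) =>
    fields.map((f) => `"${row[f] ?? ''}"`).join(','),
  );
  return [header, ...lines].join('\n');
}

const csv = toCsv(
  [{ id: 1, ticker: 'NEM', country: 'US' }],
  ['id', 'ticker', 'country'],
);
console.log(csv);
// "id","ticker","country"
// "1","NEM","US"
```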
/**
* @swagger
* /api/mining_companies/count:
* get:
* security:
* - bearerAuth: []
* tags: [Mining_companies]
* summary: Count all mining_companies
* description: Count all mining_companies
* responses:
* 200:
* description: Mining_companies count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Mining_companies"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Mining_companiesDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_companies/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Mining_companies]
* summary: Find all mining_companies that match search criteria
* description: Find all mining_companies that match search criteria
* responses:
* 200:
* description: Mining_companies list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Mining_companies"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
  const payload = await Mining_companiesDBApi.findAllAutocomplete(
    req.query.query,
    req.query.limit,
    req.query.offset,
  );
  res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_companies/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Mining_companies]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_companies"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Mining_companiesDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;

const express = require('express');
const Mining_fundamentalsService = require('../services/mining_fundamentals');
const Mining_fundamentalsDBApi = require('../db/api/mining_fundamentals');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('mining_fundamentals'));
/**
* @swagger
* components:
* schemas:
* Mining_fundamentals:
* type: object
* properties:
* notes:
* type: string
* default: notes
* production_oz:
* type: integer
* format: int64
* all_in_sustaining_cost:
* type: integer
* format: int64
* cash_cost:
* type: integer
* format: int64
* reserves_oz:
* type: integer
* format: int64
* revenue:
* type: integer
* format: int64
* ebitda:
* type: integer
* format: int64
* free_cash_flow:
* type: integer
* format: int64
* debt_to_equity:
* type: integer
* format: int64
* operating_margin:
* type: integer
* format: int64
 */
/**
* @swagger
* tags:
* name: Mining_fundamentals
* description: The Mining_fundamentals managing API
*/
/**
* @swagger
* /api/mining_fundamentals:
* post:
* security:
* - bearerAuth: []
* tags: [Mining_fundamentals]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *             description: Data of the new item
* type: object
* $ref: "#/components/schemas/Mining_fundamentals"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_fundamentals"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Mining_fundamentalsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
 * /api/mining_fundamentals/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Mining_fundamentals]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the updated items
* type: array
* items:
* $ref: "#/components/schemas/Mining_fundamentals"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_fundamentals"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Mining_fundamentalsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_fundamentals/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Mining_fundamentals]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Mining_fundamentals"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_fundamentals"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Mining_fundamentalsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_fundamentals/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Mining_fundamentals]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_fundamentals"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Mining_fundamentalsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_fundamentals/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Mining_fundamentals]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
 *             data:
 *               description: IDs of the items to delete
 *               type: array
* responses:
* 200:
 *         description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_fundamentals"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Mining_fundamentalsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_fundamentals:
* get:
* security:
* - bearerAuth: []
* tags: [Mining_fundamentals]
* summary: Get all mining_fundamentals
* description: Get all mining_fundamentals
* responses:
* 200:
* description: Mining_fundamentals list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Mining_fundamentals"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
  const filetype = req.query.filetype;
  const currentUser = req.currentUser;
  const payload = await Mining_fundamentalsDBApi.findAll(
    req.query, { currentUser }
  );
  if (filetype && filetype === 'csv') {
    const fields = ['id', 'notes',
      'production_oz', 'all_in_sustaining_cost', 'cash_cost', 'reserves_oz', 'revenue', 'ebitda', 'free_cash_flow', 'debt_to_equity', 'operating_margin',
      'period_end_at',
    ];
    const opts = { fields };
    try {
      const csv = parse(payload.rows, opts);
      // attachment() expects a filename; the CSV body goes in send()
      res.status(200).attachment('mining_fundamentals.csv');
      res.send(csv);
    } catch (err) {
      console.error(err);
      res.status(500).send(err);
    }
  } else {
    res.status(200).send(payload);
  }
}));
/**
* @swagger
* /api/mining_fundamentals/count:
* get:
* security:
* - bearerAuth: []
* tags: [Mining_fundamentals]
* summary: Count all mining_fundamentals
* description: Count all mining_fundamentals
* responses:
* 200:
* description: Mining_fundamentals count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Mining_fundamentals"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Mining_fundamentalsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_fundamentals/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Mining_fundamentals]
* summary: Find all mining_fundamentals that match search criteria
* description: Find all mining_fundamentals that match search criteria
* responses:
* 200:
* description: Mining_fundamentals list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Mining_fundamentals"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
  const payload = await Mining_fundamentalsDBApi.findAllAutocomplete(
    req.query.query,
    req.query.limit,
    req.query.offset,
  );
  res.status(200).send(payload);
}));
/**
* @swagger
* /api/mining_fundamentals/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Mining_fundamentals]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Mining_fundamentals"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Mining_fundamentalsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;
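The `deleteByIds` handlers above read the ID list from `req.body.data`, so the JSON request body a client sends has that shape (the IDs below are placeholders):

```javascript
// Request body shape expected by the deleteByIds handlers, which read
// req.body.data (IDs here are placeholders for illustration):
const body = JSON.stringify({ data: ['id-1', 'id-2'] });
console.log(JSON.parse(body).data.length); // 2
```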

const express = require('express');
const Model_runsService = require('../services/model_runs');
const Model_runsDBApi = require('../db/api/model_runs');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('model_runs'));
/**
* @swagger
* components:
* schemas:
* Model_runs:
* type: object
* properties:
* data_window:
* type: string
* default: data_window
* metrics_summary:
* type: string
* default: metrics_summary
* error_details:
* type: string
* default: error_details
 */
/**
* @swagger
* tags:
* name: Model_runs
* description: The Model_runs managing API
*/
/**
* @swagger
* /api/model_runs:
* post:
* security:
* - bearerAuth: []
* tags: [Model_runs]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *             description: Data of the new item
* type: object
* $ref: "#/components/schemas/Model_runs"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Model_runs"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Model_runsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
 * /api/model_runs/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Model_runs]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the updated items
* type: array
* items:
* $ref: "#/components/schemas/Model_runs"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Model_runs"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await Model_runsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/model_runs/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Model_runs]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Model_runs"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Model_runs"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await Model_runsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/model_runs/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Model_runs]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Model_runs"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await Model_runsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/model_runs/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Model_runs]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
 *             data:
 *               description: IDs of the items to delete
 *               type: array
* responses:
* 200:
 *         description: The items were successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Model_runs"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await Model_runsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/model_runs:
* get:
* security:
* - bearerAuth: []
* tags: [Model_runs]
* summary: Get all model_runs
* description: Get all model_runs
* responses:
* 200:
* description: Model_runs list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Model_runs"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
  const filetype = req.query.filetype;
  const currentUser = req.currentUser;
  const payload = await Model_runsDBApi.findAll(
    req.query, { currentUser }
  );
  if (filetype && filetype === 'csv') {
    const fields = ['id', 'data_window', 'metrics_summary', 'error_details',
      'started_at', 'ended_at',
    ];
    const opts = { fields };
    try {
      const csv = parse(payload.rows, opts);
      // attachment() expects a filename; the CSV body goes in send()
      res.status(200).attachment('model_runs.csv');
      res.send(csv);
    } catch (err) {
      console.error(err);
      res.status(500).send(err);
    }
  } else {
    res.status(200).send(payload);
  }
}));
/**
* @swagger
* /api/model_runs/count:
* get:
* security:
* - bearerAuth: []
* tags: [Model_runs]
* summary: Count all model_runs
* description: Count all model_runs
* responses:
* 200:
* description: Model_runs count successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Model_runs"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await Model_runsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/model_runs/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Model_runs]
* summary: Find all model_runs that match search criteria
* description: Find all model_runs that match search criteria
* responses:
* 200:
* description: Model_runs list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Model_runs"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
  const payload = await Model_runsDBApi.findAllAutocomplete(
    req.query.query,
    req.query.limit,
    req.query.offset,
  );
  res.status(200).send(payload);
}));
/**
* @swagger
* /api/model_runs/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Model_runs]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Model_runs"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await Model_runsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;
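Each router opens with `router.use(checkCrudPermissions('…'))` before any route. The real middleware lives in `../middlewares/check-permissions`; a minimal sketch of the shape these routers assume (map the HTTP method to a CRUD action, reject with 403 unless the current user holds the permission) — the permission naming below is an assumption for illustration:

```javascript
// Assumed shape of the checkCrudPermissions middleware (the real code lives
// in ../middlewares/check-permissions): map HTTP method -> CRUD action and
// reject with 403 unless the current user's permissions grant it.
const METHOD_TO_ACTION = {
  GET: 'READ',
  POST: 'CREATE',
  PUT: 'UPDATE',
  DELETE: 'DELETE',
};

const checkCrudPermissions = (entity) => (req, res, next) => {
  const action = METHOD_TO_ACTION[req.method];
  const permissions = (req.currentUser && req.currentUser.permissions) || [];
  // Permission strings like 'READ_MODEL_RUNS' are a hypothetical scheme.
  if (permissions.includes(`${action}_${entity.toUpperCase()}`)) {
    return next();
  }
  res.status(403).send({ error: 'Forbidden' });
};

// Demo: a user holding READ_MODEL_RUNS passes the GET check
const allowed = { method: 'GET', currentUser: { permissions: ['READ_MODEL_RUNS'] } };
checkCrudPermissions('model_runs')(allowed, { status: () => ({ send: () => {} }) }, () =>
  console.log('allowed'),
); // logs "allowed"
```

Registering this with `router.use(...)` before the route definitions means every endpoint in the file is gated, including the CSV export and autocomplete variants.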

const express = require('express');
const ModelsService = require('../services/models');
const ModelsDBApi = require('../db/api/models');
const wrapAsync = require('../helpers').wrapAsync;
const router = express.Router();
const { parse } = require('json2csv');
const {
checkCrudPermissions,
} = require('../middlewares/check-permissions');
router.use(checkCrudPermissions('models'));
/**
* @swagger
* components:
* schemas:
* Models:
* type: object
* properties:
* name:
* type: string
* default: name
* objective:
* type: string
* default: objective
* artifact_uri:
* type: string
* default: artifact_uri
* config_snapshot:
* type: string
* default: config_snapshot
* target_metric_value:
* type: integer
* format: int64
 */
/**
* @swagger
* tags:
* name: Models
* description: The Models managing API
*/
/**
* @swagger
* /api/models:
* post:
* security:
* - bearerAuth: []
* tags: [Models]
* summary: Add new item
* description: Add new item
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
 *             description: Data of the new item
* type: object
* $ref: "#/components/schemas/Models"
* responses:
* 200:
* description: The item was successfully added
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Models"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*/
router.post('/', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await ModelsService.create(req.body.data, req.currentUser, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
 * /api/models/bulk-import:
* post:
* security:
* - bearerAuth: []
* tags: [Models]
* summary: Bulk import items
* description: Bulk import items
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
* data:
* description: Data of the updated items
* type: array
* items:
* $ref: "#/components/schemas/Models"
* responses:
* 200:
* description: The items were successfully imported
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Models"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 405:
* description: Invalid input data
* 500:
* description: Some server error
*
*/
router.post('/bulk-import', wrapAsync(async (req, res) => {
const referer = req.headers.referer || `${req.protocol}://${req.hostname}${req.originalUrl}`;
const link = new URL(referer);
await ModelsService.bulkImport(req, res, true, link.host);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/models/{id}:
* put:
* security:
* - bearerAuth: []
* tags: [Models]
* summary: Update the data of the selected item
* description: Update the data of the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to update
* required: true
* schema:
* type: string
* requestBody:
* description: Set new item data
* required: true
* content:
* application/json:
* schema:
* properties:
* id:
* description: ID of the updated item
* type: string
* data:
* description: Data of the updated item
* type: object
* $ref: "#/components/schemas/Models"
* required:
* - id
* responses:
* 200:
* description: The item data was successfully updated
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Models"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.put('/:id', wrapAsync(async (req, res) => {
await ModelsService.update(req.body.data, req.body.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/models/{id}:
* delete:
* security:
* - bearerAuth: []
* tags: [Models]
* summary: Delete the selected item
* description: Delete the selected item
* parameters:
* - in: path
* name: id
* description: Item ID to delete
* required: true
* schema:
* type: string
* responses:
* 200:
* description: The item was successfully deleted
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Models"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.delete('/:id', wrapAsync(async (req, res) => {
await ModelsService.remove(req.params.id, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/models/deleteByIds:
* post:
* security:
* - bearerAuth: []
* tags: [Models]
* summary: Delete the selected item list
* description: Delete the selected item list
* requestBody:
* required: true
* content:
* application/json:
* schema:
* properties:
 *               data:
 *                 description: IDs of the items to delete
 *                 type: array
 *                 items:
 *                   type: string
* responses:
* 200:
 *         description: The items were successfully deleted
 *         content:
 *           application/json:
 *             schema:
 *               type: boolean
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Items not found
* 500:
* description: Some server error
*/
router.post('/deleteByIds', wrapAsync(async (req, res) => {
await ModelsService.deleteByIds(req.body.data, req.currentUser);
const payload = true;
res.status(200).send(payload);
}));
/**
* @swagger
* /api/models:
* get:
* security:
* - bearerAuth: []
* tags: [Models]
 *     summary: Get all models
 *     description: Get all models
 *     parameters:
 *       - in: query
 *         name: filetype
 *         description: Set to "csv" to download the list as a CSV attachment
 *         schema:
 *           type: string
* responses:
* 200:
* description: Models list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Models"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/', wrapAsync(async (req, res) => {
  const filetype = req.query.filetype;
  const currentUser = req.currentUser;
  const payload = await ModelsDBApi.findAll(req.query, { currentUser });

  if (filetype === 'csv') {
    const fields = [
      'id',
      'name',
      'objective',
      'artifact_uri',
      'config_snapshot',
      'target_metric_value',
      'last_trained_at',
    ];
    try {
      const csv = parse(payload.rows, { fields });
      // res.attachment() expects a filename, not the file contents.
      res.status(200).attachment('models.csv');
      res.send(csv);
    } catch (err) {
      console.error(err);
      // Without a response here, a failed export would leave the request hanging.
      res.status(500).send({ error: 'Failed to export CSV' });
    }
  } else {
    res.status(200).send(payload);
  }
}));
/**
* @swagger
* /api/models/count:
* get:
* security:
* - bearerAuth: []
* tags: [Models]
* summary: Count all models
* description: Count all models
* responses:
* 200:
* description: Models count successfully received
* content:
* application/json:
 *             schema:
 *               type: object
 *               properties:
 *                 count:
 *                   type: integer
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/count', wrapAsync(async (req, res) => {
const currentUser = req.currentUser;
const payload = await ModelsDBApi.findAll(
req.query,
null,
{ countOnly: true, currentUser }
);
res.status(200).send(payload);
}));
/**
* @swagger
* /api/models/autocomplete:
* get:
* security:
* - bearerAuth: []
* tags: [Models]
 *     summary: Find all models that match search criteria
 *     description: Find all models that match search criteria
 *     parameters:
 *       - in: query
 *         name: query
 *         description: Search string
 *         schema:
 *           type: string
 *       - in: query
 *         name: limit
 *         description: Maximum number of results
 *         schema:
 *           type: integer
 *       - in: query
 *         name: offset
 *         description: Number of results to skip
 *         schema:
 *           type: integer
* responses:
* 200:
* description: Models list successfully received
* content:
* application/json:
* schema:
* type: array
* items:
* $ref: "#/components/schemas/Models"
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Data not found
* 500:
* description: Some server error
*/
router.get('/autocomplete', wrapAsync(async (req, res) => {
  const payload = await ModelsDBApi.findAllAutocomplete(
    req.query.query,
    req.query.limit,
    req.query.offset,
  );

  res.status(200).send(payload);
}));
/**
* @swagger
* /api/models/{id}:
* get:
* security:
* - bearerAuth: []
* tags: [Models]
* summary: Get selected item
* description: Get selected item
* parameters:
* - in: path
* name: id
* description: ID of item to get
* required: true
* schema:
* type: string
* responses:
* 200:
* description: Selected item successfully received
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Models"
* 400:
* description: Invalid ID supplied
* 401:
* $ref: "#/components/responses/UnauthorizedError"
* 404:
* description: Item not found
* 500:
* description: Some server error
*/
router.get('/:id', wrapAsync(async (req, res) => {
const payload = await ModelsDBApi.findBy(
{ id: req.params.id },
);
res.status(200).send(payload);
}));
router.use('/', require('../helpers').commonErrorHandler);
module.exports = router;
