Initial version

This commit is contained in:
Flatlogic Bot 2026-02-13 16:37:10 +00:00
commit 2e8014b884
749 changed files with 245289 additions and 0 deletions

305
.cursorrules Normal file

@@ -0,0 +1,305 @@
# Cursor Rules - Group 1: Development Philosophy & Coding Conventions
1. Overall Architecture & Structure:
- Enforce a clear separation of concerns between the backend and the frontend:
- **Backend**: Use Express for routing, Passport for authentication, and Swagger for API documentation. Organize code into modules such as routes, services, and helpers.
- **Example**:
- Routes: `src/routes/auth.js` for authentication routes.
- Services: `src/services/auth.js` for authentication logic.
- Helpers: `src/helpers/wrapAsync.js` for wrapping asynchronous functions.
- **Frontend**: Use Next.js with React and TypeScript. Structure components using functional components, hooks, and layouts.
- **Example**:
- Pages: `pages/index.tsx` for the main page.
- Components: `components/Header.tsx` for the header component.
- Layouts: `layouts/MainLayout.tsx` for common page layouts.
- Ensure that backend modules and frontend components are organized for reusability and maintainability:
- **Backend**: Separate business logic into services and use middleware for common tasks.
- **Frontend**: Use reusable components and hooks to manage state and lifecycle.
2. Coding Style & Formatting:
- For the backend (JavaScript):
• Use ES6+ features (const/let, arrow functions) consistently.
• Follow Prettier and ESLint configurations (e.g., consistent 2-space indentation, semicolons, and single quotes).
• Maintain clear asynchronous patterns with helper wrappers (e.g., wrapAsync).
- **Example from auth.js**:
```javascript
router.post('/signin/local', wrapAsync(async (req, res) => {
const payload = await AuthService.signin(req.body.email, req.body.password, req);
res.status(200).send(payload);
}));
```
• Document API endpoints with inline Swagger comments to ensure API clarity and consistency.
- **Example**:
```javascript
/**
* @swagger
* /api/auth/signin:
* post:
* summary: Sign in a user
* responses:
* 200:
* description: Successful login
*/
```
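The `wrapAsync` helper referenced above is not shown in this file; a minimal sketch of what it might look like (the actual `src/helpers/wrapAsync.js` implementation may differ):

```javascript
// Sketch of a wrapAsync helper: forwards any rejection from an async
// route handler to Express's next() so centralized error middleware
// can handle it. The real project helper may differ.
const wrapAsync = (fn) => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);
```

With this wrapper, a thrown error inside an `async` handler reaches the error middleware instead of producing an unhandled rejection.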
- For the frontend (TypeScript/React):
• Use functional components with strict typing and separation of concerns.
- **Example**:
```typescript
const Button: React.FC<{ onClick: () => void }> = ({ onClick }) => (
<button onClick={onClick}>Click me</button>
);
```
• Follow naming conventions: PascalCase for components and types/interfaces, camelCase for variables, hooks, and function names.
- **Example**:
```typescript
const useCustomHook = () => {
const [state, setState] = useState(false);
return [state, setState] as const;
};
```
• Utilize hooks (useEffect, useState) to manage state and lifecycle in a clear and concise manner.
- **Example**:
```typescript
useEffect(() => {
console.log('Component mounted');
}, []);
```
3. Code Quality & Best Practices:
- Ensure code modularity by splitting complex logic into smaller, testable units.
- **Example**: In `auth.js`, routes are separated from business logic, which is handled in `AuthService`.
- Write self-documenting code and add comments where the logic is non-trivial.
- **Example**: Use descriptive function and variable names in `auth.js`, and add comments for complex asynchronous operations.
- Embrace declarative programming and adhere to SOLID principles.
- **Example**: In service functions, ensure each function has a single responsibility and dependencies are injected rather than hardcoded.
4. Consistency & Tools Integration:
- Leverage existing tools like Prettier and ESLint to automatically enforce style and formatting rules.
- **Example**: Use `.prettierrc` and `.eslintrc.cjs` for configuration in your project.
- Use TypeScript in the frontend to ensure type safety and catch errors early.
- **Example**: Define interfaces and types in your React components to enforce strict typing.
- Maintain uniformity in API design and error handling strategies.
- **Example**: Consistently use Passport for authentication and a common error handling middleware in `auth.js`.
## Group 2 Naming Conventions
1. File Naming and Structure:
• Frontend:
- Page Files: Use lower-case filenames (e.g., index.tsx) as prescribed by Next.js conventions.
- **Example**: `pages/index.tsx`, `pages/about.tsx`
- Component Files: Use PascalCase for React component files (e.g., WebSiteHeader.tsx, NavBar.tsx).
- **Example**: `components/Header.tsx`, `components/Footer.tsx`
- Directories: Use clear, descriptive names (e.g., 'pages', 'components', 'WebPageComponents').
- **Example**: `src/pages`, `src/components`
• Backend:
- Use lower-case filenames for modules (e.g., index.js, auth.js, projects.js).
- **Example**: `routes/auth.js`, `services/user.js`
- When needed, use hyphenation for clarity, but maintain consistency.
- **Example**: `helpers/wrap-async.js`
2. Component and Module Naming:
• Frontend:
- React Components: Define components in PascalCase.
- TypeScript Interfaces/Types: Use PascalCase (e.g., WebSiteHeaderProps).
• Backend:
- Classes (if any) and constructors should be in PascalCase; most helper functions and modules use camelCase.
3. Variable, Function, and Hook Naming:
• Use camelCase for variables and function names in both frontend and backend.
- **Example**:
```javascript
const userName = 'John Doe';
function handleLogin() { ... }
```
• Custom Hooks: Prefix with 'use' (e.g., useAuth, useForm).
- **Example**:
```typescript
const useAuth = () => {
const [isAuthenticated, setIsAuthenticated] = useState(false);
return { isAuthenticated, setIsAuthenticated };
};
```
4. Consistency and Readability:
• Maintain uniform naming across the project to ensure clarity and ease of maintenance.
- **Example**: Use consistent naming conventions for variables, functions, and components, such as camelCase for variables and functions, and PascalCase for components.
- **Example**: In `auth.js`, ensure that all function names clearly describe their purpose, such as `handleLogin` or `validateUserInput`.
## Group 3 Frontend & React Best Practices
1. Use of Functional Components & TypeScript:
• Build all components as functional components.
- **Example**:
```typescript
const Header: React.FC = () => {
return <header>Header Content</header>;
};
```
• Leverage TypeScript for static type checking and enforce strict prop and state types.
- **Example**:
```typescript
interface ButtonProps {
onClick: () => void;
}
const Button: React.FC<ButtonProps> = ({ onClick }) => (
<button onClick={onClick}>Click me</button>
);
```
2. Effective Use of React Hooks:
• Utilize useState and useEffect appropriately with proper dependency arrays.
- **Example**:
```typescript
const [count, setCount] = useState(0);
useEffect(() => {
console.log('Component mounted');
}, []);
```
• Create custom hooks to encapsulate shared logic (e.g., useAppSelector).
- **Example**:
```typescript
const useAuth = () => {
const [isAuthenticated, setIsAuthenticated] = useState(false);
return { isAuthenticated, setIsAuthenticated };
};
```
3. Component Composition & Separation of Concerns:
• Separate presentational (stateless) components from container components managing logic.
- **Example**: Use `LayoutGuest` to encapsulate common page structures.
4. Code Quality & Readability:
• Maintain consistent formatting and adhere to Prettier and ESLint rules.
• Use descriptive names for variables, functions, and components.
• Document non-trivial logic with inline comments and consider implementing error boundaries where needed.
• New code must adhere to these conventions to avoid ambiguity.
• Use descriptive names that reflect the purpose and domain, avoiding abbreviations unless standard in the project.
## Group 4 Backend & API Guidelines
1. API Endpoint Design & Documentation:
• Follow RESTful naming conventions; all route handlers should be named clearly and consistently.
- **Example**: Use verbs like `GET`, `POST`, `PUT`, `DELETE` to define actions, e.g., `GET /api/auth/me` to retrieve user info.
• Document endpoints with Swagger annotations to provide descriptions, expected request bodies, and response codes.
- **Example**:
```javascript
/**
* @swagger
* /api/auth/signin:
* post:
* summary: Sign in a user
* requestBody:
* description: User credentials
* content:
* application/json:
* schema:
* $ref: "#/components/schemas/Auth"
* responses:
* 200:
* description: Successful login
* 400:
* description: Invalid username/password supplied
*/
```
• Examples (for Auth endpoints):
- POST /api/auth/signin/local
• Description: Logs the user into the system.
• Request Body (application/json):
{ "email": "admin@flatlogic.com", "password": "password" }
• Responses:
- 200: Successful login (returns token and user data).
- 400: Invalid username/password supplied.
- GET /api/auth/me
• Description: Retrieves current authorized user information.
• Secured via Passport JWT; uses req.currentUser.
• Responses:
- 200: Returns current user info.
- 400: Invalid credentials or missing user data.
- POST /api/auth/signup
• Description: Registers a new user.
• Request Body (application/json):
{ "email": "admin@flatlogic.com", "password": "password" }
• Responses:
- 200: New user signed up successfully.
- 400: Invalid input supplied.
- 500: Server error.
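A hedged client-side sketch of calling the sign-in endpoint described above (the `signin` helper and the injectable `fetchImpl` parameter are illustrative, not part of the project):

```javascript
// Illustrative client for POST /api/auth/signin/local. fetchImpl is
// injectable so the function can be exercised without a running server.
async function signin(baseUrl, email, password, fetchImpl = fetch) {
  const res = await fetchImpl(`${baseUrl}/api/auth/signin/local`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ email, password }),
  });
  if (!res.ok) {
    // 400 corresponds to "Invalid username/password supplied" above
    throw new Error(`Sign-in failed with status ${res.status}`);
  }
  return res.json(); // expected to contain the token and user data
}
```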
## Group 5 Testing, Quality Assurance & Error Handling
1. Testing Guidelines:
• Write unit tests for critical backend and frontend components using frameworks such as Jest, React Testing Library, and Mocha/Chai.
- **Example**:
```javascript
test('should return user data', async () => {
const user = await getUserData();
expect(user).toHaveProperty('email');
});
```
• Practice test-driven development and maintain high test coverage.
• Regularly update tests following changes in business logic.
2. Quality Assurance:
• Enforce code quality with ESLint, Prettier, and static analysis tools.
• Integrate continuous testing workflows (CI/CD) to catch issues early.
- **Example**: Use GitHub Actions for automated testing and deployment.
• Ensure documentation is kept up-to-date with the implemented code.
3. Error Handling:
• Back-end:
- Wrap asynchronous route handlers with a helper (e.g., wrapAsync) to capture errors.
- **Example**:
```javascript
router.post('/signin', wrapAsync(async (req, res) => {
const user = await AuthService.signin(req.body);
res.send(user);
}));
```
- Use centralized error handling middleware (e.g., commonErrorHandler) for uniform error responses.
• Front-end:
- Implement error boundaries in React to gracefully handle runtime errors.
- Display user-friendly error messages and log errors for further analysis.
4. Authentication & Security:
• Protect endpoints by using Passport.js with JWT (e.g., passport.authenticate('jwt', { session: false })).
- **Example**:
```javascript
router.get('/profile', passport.authenticate('jwt', { session: false }), (req, res) => {
res.send(req.user);
});
```
• Ensure that secure routes check for existence of req.currentUser. If absent, return a ForbiddenError.
5. Consistent Error Handling & Middleware Usage:
• Wrap asynchronous route handlers with helpers like wrapAsync for error propagation.
• Use centralized error handling middleware (e.g., commonErrorHandler) to capture and format errors uniformly.
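A minimal sketch of what such a centralized handler could look like (the real `commonErrorHandler` in this project may format responses differently):

```javascript
// Sketch of a centralized Express-style error handler. Registered last
// via app.use(commonErrorHandler); the four-argument signature is what
// marks it as error middleware. The response shape is an assumption.
function commonErrorHandler(err, req, res, next) { // eslint-disable-line no-unused-vars
  const status = err.status || 500;
  res.status(status).send({
    error: { message: err.message || 'Internal server error' },
  });
}
```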
6. Modular Code Organization:
• Organize backend code into separate files for routes, services, and database access (e.g., auth.js, projects.js, tasks.js).
• Use descriptive, lowercase filenames for modules and routes.
7. Endpoint Security Best Practices:
• Validate input data and sanitize requests where necessary.
• Restrict sensitive operations to authenticated users with proper role-based permissions.
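The input-validation point above can be illustrated with a tiny hand-rolled middleware (a sketch only; a dedicated schema validator would usually be preferable, and `requireFields` is not an existing project helper):

```javascript
// Illustrative middleware: rejects requests whose JSON body is missing
// required fields, before the route handler runs.
function requireFields(...fields) {
  return (req, res, next) => {
    const body = req.body || {};
    const missing = fields.filter((f) => body[f] == null || body[f] === '');
    if (missing.length > 0) {
      return res.status(400).send({ error: `Missing fields: ${missing.join(', ')}` });
    }
    next();
  };
}

// Usage: router.post('/signup', requireFields('email', 'password'), handler);
```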
────────────────────────────────────────
Group 6 Accessibility, UI, and Styling Guidelines (Updated)
────────────────────────────────────────
1. Sidebar Styling:
• The sidebar is implemented in the authenticated layout via the AsideMenu component, with the actual element defined in AsideMenuLayer (located at frontend/src/components/AsideMenuLayer.tsx) as an <aside> element with id="asideMenu".
- **Example**:
```css
#asideMenu {
background-color: #F8F4E1 !important;
}
```
• When modifying sidebar styles, target #asideMenu and its child elements rather than generic selectors (e.g., avoid .app-sidebar) to ensure that the changes affect the actual rendered sidebar.
• Remove or override any conflicting background utilities (such as an unwanted bg-white) so our desired background color (#F8F4E1) is fully visible. Use a highly specific selector if necessary.
• Adjust spacing (padding/margins) at both the container (#asideMenu) and the individual menu item level to maintain a consistent, compact design.
2. General Project Styling and Tailwind CSS Usage:
• The application leverages Tailwind CSS extensively, with core styling defined in _theme.css using the @apply directive. Any new modifications should follow this pattern to ensure consistency.
- **Example**:
```css
.btn {
@apply bg-blue-500 text-white;
}
```
• The themed blocks (like .theme-pink and .theme-green) standardize the UI's appearance. When applying custom overrides, ensure they integrate cleanly into these structures and avoid conflicts or circular dependency errors (e.g., issues when redefining utilities such as text-blue-600).
• Adjustments via Tailwind CSS generally require modifying class names in the components and ensuring that global overrides are applied in the correct order. Consistent use of design tokens and custom color codes (e.g., #F8F4E1) throughout the app is crucial to a cohesive design.
• Specificity is key. If a change isn't visually reflected as expected, inspect the rendered HTML to identify which classes are taking precedence.

3
.dockerignore Normal file

@@ -0,0 +1,3 @@
backend/node_modules
frontend/node_modules
frontend/build

3
.gitignore vendored Normal file

@@ -0,0 +1,3 @@
node_modules/
*/node_modules/
*/build/

187
502.html Normal file

@@ -0,0 +1,187 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Service Starting</title>
<style>
body {
font-family: sans-serif;
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
min-height: 100vh;
background-color: #EFF2FF;
margin: 0;
padding: 20px;
}
.container {
text-align: center;
padding: 30px 40px;
background-color: #fff;
border-radius: 20px;
margin-bottom: 20px;
max-width: 538px;
width: 100%;
box-shadow: 0 13px 34px 0 rgba(167, 187, 242, 0.2);
box-sizing: border-box;
}
#status-heading {
font-size: 24px;
font-weight: 700;
color: #02004E;
margin-bottom: 20px;
}
h2 {
color: #333;
margin-bottom: 15px;
}
p {
color: #666;
font-size: 1.1em;
margin-bottom: 10px;
}
.tip {
font-weight: 300;
font-size: 17px;
line-height: 150%;
letter-spacing: 0;
text-align: center;
margin-top: 30px;
}
.loader-container {
position: relative;
display: flex;
justify-content: center;
align-items: center;
}
.loader {
width: 100px;
aspect-ratio: 1;
border-radius: 50%;
background:
radial-gradient(farthest-side, #5C7EF1 94%, #0000) top/8px 8px no-repeat,
conic-gradient(#0000 30%, #5C7EF1);
-webkit-mask: radial-gradient(farthest-side, #0000 calc(100% - 8px), #000 0);
animation: l13 2s infinite linear;
}
@keyframes l13 {
100% {
transform: rotate(1turn)
}
}
.app-logo {
position: absolute;
width: 36px;
}
.panel {
padding: 0 18px;
display: none;
background-color: white;
overflow: hidden;
margin-top: 10px;
}
.show {
display: block;
}
.project-info {
border: 1px solid #8C9DFF;
border-radius: 10px;
padding: 12px 16px;
max-width: 600px;
margin: 40px auto;
background-color: #FBFCFF;
}
.project-info h2 {
color: #02004E;
font-size: 14px;
font-weight: 500;
margin-bottom: 10px;
text-align: left;
}
.project-info p {
color: #686791;
font-size: 12px;
font-weight: 400;
text-align: left;
}
</style>
</head>
<body>
<div class="container">
<h2 id="status-heading">Loading the app, just a moment…</h2>
<p class="tip">The application is currently launching. The page will automatically refresh once the site is available.</p>
<div class="project-info">
<h2>Limmu Gennet School Website</h2>
<p>Tri-language public-first school website with student and staff portals for grades, attendance, news, events, and resources.</p>
</div>
<div class="loader-container">
<img src="https://flatlogic.com/blog/wp-content/uploads/2025/05/logo-bot-1.png" alt="App Logo"
class="app-logo">
<div class="loader"></div>
</div>
<div class="panel">
<video width="100%" height="315" controls loop>
<source
src="https://flatlogic.com/blog/wp-content/uploads/2025/04/20250430_1336_professional_dynamo_spinner_simple_compose_01jt349yvtenxt7xhg8hhr85j8.mp4"
type="video/mp4">
Your browser does not support the video tag.
</video>
</div>
</div>
<script>
function checkAvailability() {
fetch('/')
.then(response => {
if (response.ok) {
window.location.reload();
} else {
setTimeout(checkAvailability, 5000);
}
})
.catch(() => {
setTimeout(checkAvailability, 5000);
});
}
document.addEventListener('DOMContentLoaded', checkAvailability);
document.addEventListener('DOMContentLoaded', function () {
const appTitle = document.querySelector('#status-heading');
const panel = document.querySelector('.panel');
const video = panel.querySelector('video');
let clickCount = 0;
appTitle.addEventListener('click', function () {
clickCount++;
if (clickCount === 5) {
panel.classList.toggle('show');
if (panel.classList.contains('show')) {
video.play();
} else {
video.pause();
}
clickCount = 0;
}
});
});
</script>
</body>
</html>

21
Dockerfile Normal file

@@ -0,0 +1,21 @@
FROM node:20.15.1-alpine AS builder
RUN apk add --no-cache git
WORKDIR /app
COPY frontend/package.json frontend/yarn.lock ./
RUN yarn install --pure-lockfile
COPY frontend .
RUN yarn build
FROM node:20.15.1-alpine
WORKDIR /app
COPY backend/package.json backend/yarn.lock ./
RUN yarn install --pure-lockfile
COPY backend .
COPY --from=builder /app/build /app/public
CMD ["yarn", "start"]

85
Dockerfile.dev Normal file

@@ -0,0 +1,85 @@
# Base image for Node.js dependencies
FROM node:20.15.1-alpine AS frontend-deps
RUN apk add --no-cache git
WORKDIR /app/frontend
COPY frontend/package.json frontend/yarn.lock ./
RUN yarn install --pure-lockfile
FROM node:20.15.1-alpine AS backend-deps
RUN apk add --no-cache git
WORKDIR /app/backend
COPY backend/package.json backend/yarn.lock ./
RUN yarn install --pure-lockfile
FROM node:20.15.1-alpine AS app-shell-deps
RUN apk add --no-cache git
WORKDIR /app/app-shell
COPY app-shell/package.json app-shell/yarn.lock ./
RUN yarn install --pure-lockfile
# Nginx setup and application build
FROM node:20.15.1-alpine AS build
RUN apk add --no-cache git nginx curl
RUN apk add --no-cache lsof procps
RUN yarn global add concurrently
RUN apk add --no-cache \
chromium \
nss \
freetype \
harfbuzz \
ttf-freefont \
fontconfig
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium-browser
RUN mkdir -p /app/pids
# Make sure to add yarn global bin to PATH
ENV PATH /root/.yarn/bin:/root/.config/yarn/global/node_modules/.bin:$PATH
# Copy dependencies
WORKDIR /app
COPY --from=frontend-deps /app/frontend /app/frontend
COPY --from=backend-deps /app/backend /app/backend
COPY --from=app-shell-deps /app/app-shell /app/app-shell
COPY frontend /app/frontend
COPY backend /app/backend
COPY app-shell /app/app-shell
COPY docker /app/docker
# Copy all files from root to /app
COPY . /app
# Copy Nginx configuration
COPY nginx.conf /etc/nginx/nginx.conf
# Copy custom error page
COPY 502.html /usr/share/nginx/html/502.html
# Change owner and permissions of the error page
RUN chown nginx:nginx /usr/share/nginx/html/502.html && \
chmod 644 /usr/share/nginx/html/502.html
# Expose the port the app runs on
EXPOSE 8080
ENV NODE_ENV=dev_stage
ENV FRONT_PORT=3001
ENV BACKEND_PORT=3000
ENV APP_SHELL_PORT=4000
CMD ["sh", "-c", "\
yarn --cwd /app/frontend dev & echo $! > /app/pids/frontend.pid && \
yarn --cwd /app/backend start & echo $! > /app/pids/backend.pid && \
sleep 10 && nginx -g 'daemon off;' & \
NGINX_PID=$! && \
echo 'Waiting for backend (port 3000) to be available...' && \
while ! nc -z localhost ${BACKEND_PORT}; do \
sleep 2; \
done && \
echo 'Backend is up. Starting app_shell for Git check...' && \
yarn --cwd /app/app-shell start && \
wait $NGINX_PID"]

1
LICENSE Normal file

@@ -0,0 +1 @@
https://flatlogic.com/

244
README.md Normal file

@@ -0,0 +1,244 @@
# Limmu Gennet School Website
## This project was generated by [Flatlogic Platform](https://flatlogic.com).
- Frontend: [React.js](https://flatlogic.com/templates?framework%5B%5D=react&sort=default)
- Backend: [NodeJS](https://flatlogic.com/templates?backend%5B%5D=nodejs&sort=default)
<details><summary>Backend Folder Structure</summary>
The generated application has the following backend folder structure:
The `src` folder contains your working files, which will later be used to create the build. It includes the following folders:
- `auth` - configuration of the library for authentication and authorization;
- `db` - contains the following folders:
  - `api` - documentation automatically generated by jsdoc or similar tools;
  - `migrations` - the skeleton of the database, i.e. all the actions users perform on the database;
  - `models` - what represents the database for the backend;
  - `seeders` - the entity that creates the data for the database.
- `routes` - contains all the routes you have created using Express Router; what they do is exported from a Controller file;
- `services` - contains folders such as `emails` and `notifications`.
</details>
- Database: PostgreSQL
- app-shell: Core application framework that provides essential infrastructure services for the entire application.
-----------------------
### We offer two ways to start the project locally: by running the Frontend and Backend directly, or with Docker.
-----------------------
## To start the project:
### Backend:
> Please change current folder: `cd backend`
#### Install local dependencies:
`yarn install`
------------
#### Adjust local db:
##### 1. Install postgres:
MacOS:
`brew install postgres`
> if you don't have brew, please install it (https://brew.sh) and repeat the `brew install postgres` step.
Ubuntu:
`sudo apt update`
`sudo apt install postgresql postgresql-contrib`
##### 2. Create db and admin user:
Before running and testing the connection, make sure you have created a database as described in the configuration above. You can use the `psql` command to create the user and database.
`psql postgres -U postgres`
Next, type these commands to create a new user with a password and grant it permission to create databases.
`postgres-# CREATE ROLE admin WITH LOGIN PASSWORD 'admin_pass';`
`postgres-# ALTER ROLE admin CREATEDB;`
Quit `psql`, then log in again using the new user you just created.
`postgres-# \q`
`psql postgres -U admin`
Type this command to create a new database.
`postgres=> CREATE DATABASE db_{your_project_name};`
Then grant the new user privileges on the new database and quit `psql`.
`postgres=> GRANT ALL PRIVILEGES ON DATABASE db_{your_project_name} TO admin;`
`postgres=> \q`
------------
#### Create database:
`yarn db:create`
#### Start production build:
`yarn start`
### Frontend:
> Please change current folder: `cd frontend`
## To start the project with Docker:
### Description:
The project contains the **docker folder** and the `Dockerfile`.
The `Dockerfile` is used to deploy the project to Google Cloud.
The **docker folder** contains a couple of helper scripts:
- `docker-compose.yml` (all our services: web, backend, db are described here)
- `start-backend.sh` (starts backend, but only after the database)
- `wait-for-it.sh` (imported from https://github.com/vishnubob/wait-for-it)
> To avoid breaking the application, we recommend you don't edit the following files: everything inside the **docker folder** and the `Dockerfile`.
## Run services:
1. Install docker compose (https://docs.docker.com/compose/install/)
2. Move to `docker` folder. All next steps should be done from this folder.
``` cd docker ```
3. Make executables from `wait-for-it.sh` and `start-backend.sh`:
``` chmod +x start-backend.sh && chmod +x wait-for-it.sh ```
4. Download dependent projects for the services.
5. Review the docker-compose.yml file. Make sure that all services have Dockerfiles; only the db service doesn't require one.
6. Make sure the needed ports (see them in `ports`) are available on your local machine.
7. Start services:
7.1. With an empty database: `rm -rf data && docker-compose up`
7.2. With database data stored from previous runs: `docker-compose up`
8. Check http://localhost:3000
9. Stop services:
9.1. Just press `Ctrl+C`
## Most common errors:
1. `connection refused`
There could be many reasons, but the most common are:
- The port is not open on the destination machine.
- The port is open on the destination machine, but its backlog of pending connections is full.
- A firewall between the client and server is blocking access (also check local firewalls).
After checking for firewalls and that the port is open, use telnet to connect to the IP/port to test connectivity. This removes any potential issues from your application.
***MacOS:***
If you suspect that your SSH service might be down, you can run this command to find out:
`sudo service ssh status`
If the command line returns a status of down, then you've likely found the reason behind your connectivity error.
***Ubuntu:***
Sometimes a connection refused error can also indicate that there is an IP address conflict on your network. You can search for possible IP conflicts by running:
`arp-scan -I eth0 -l | grep <ipaddress>`
and
`arping <ipaddress>`
2. `yarn db:create` creates the database with the tables already assembled (on MacOS with a Postgres database)
The workaround: run the following commands in your Postgres database terminal:
`DROP SCHEMA public CASCADE;`
`CREATE SCHEMA public;`
`GRANT ALL ON SCHEMA public TO postgres;`
`GRANT ALL ON SCHEMA public TO public;`
Afterwards, continue to start your project in the backend directory by running:
`yarn start`

14
backend/.env Normal file

@@ -0,0 +1,14 @@
DB_NAME=app_38407
DB_USER=app_38407
DB_PASS=561da65b-ab63-42f5-8cdb-96c678795cb3
DB_HOST=127.0.0.1
DB_PORT=5432
PORT=3000
GOOGLE_CLIENT_ID=671001533244-kf1k1gmp6mnl0r030qmvdu6v36ghmim6.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=Yo4qbKZniqvojzUQ60iKlxqR
MS_CLIENT_ID=4696f457-31af-40de-897c-e00d7d4cff73
MS_CLIENT_SECRET=m8jzZ.5UpHF3=-dXzyxiZ4e[F8OF54@p
EMAIL_USER=AKIAVEW7G4PQUBGM52OF
EMAIL_PASS=BLnD4hKGb6YkSz3gaQrf8fnyLi3C3/EdjOOsLEDTDPTz
SECRET_KEY=HUEyqESqgQ1yTwzVlO6wprC9Kf1J1xuA
PEXELS_KEY=Vc99rnmOhHhJAbgGQoKLZtsaIVfkeownoQNbTj78VemUjKh08ZYRbf18

4
backend/.eslintignore Normal file

@@ -0,0 +1,4 @@
# Ignore generated and runtime files
node_modules/
tmp/
logs/

15
backend/.eslintrc.cjs Normal file

@@ -0,0 +1,15 @@
module.exports = {
env: {
node: true,
es2021: true
},
extends: [
'eslint:recommended'
],
plugins: [
'import'
],
rules: {
'import/no-unresolved': 'error'
}
};

11
backend/.prettierrc Normal file

@@ -0,0 +1,11 @@
{
"singleQuote": true,
"tabWidth": 2,
"printWidth": 80,
"trailingComma": "all",
"quoteProps": "as-needed",
"jsxSingleQuote": true,
"bracketSpacing": true,
"bracketSameLine": false,
"arrowParens": "always"
}

7
backend/.sequelizerc Normal file

@@ -0,0 +1,7 @@
const path = require('path');
module.exports = {
"config": path.resolve("src", "db", "db.config.js"),
"models-path": path.resolve("src", "db", "models"),
"seeders-path": path.resolve("src", "db", "seeders"),
"migrations-path": path.resolve("src", "db", "migrations")
};

23
backend/Dockerfile Normal file

@@ -0,0 +1,23 @@
FROM node:20.15.1-alpine
RUN apk update && apk add bash
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN yarn install
# If you are building your code for production
# RUN npm ci --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "yarn", "start" ]

56
backend/README.md Normal file

@@ -0,0 +1,56 @@
# Limmu Gennet School Website - template backend
#### Run App on local machine:
##### Install local dependencies:
- `yarn install`
------------
##### Adjust local db:
###### 1. Install postgres:
- MacOS:
- `brew install postgres`
- Ubuntu:
- `sudo apt update`
- `sudo apt install postgresql postgresql-contrib`
###### 2. Create db and admin user:
- Before running and testing the connection, make sure you have created a database as described in the configuration above. You can use the `psql` command to create the user and database.
- `psql postgres -U postgres`
- Next, type these commands to create a new user with a password and grant it permission to create databases.
- `postgres-# CREATE ROLE admin WITH LOGIN PASSWORD 'admin_pass';`
- `postgres-# ALTER ROLE admin CREATEDB;`
- Quit `psql`, then log in again using the new user you just created.
- `postgres-# \q`
- `psql postgres -U admin`
- Type this command to create a new database.
- `postgres=> CREATE DATABASE db_limmu_gennet_school_website;`
- Then grant the new user privileges on the new database and quit `psql`.
- `postgres=> GRANT ALL PRIVILEGES ON DATABASE db_limmu_gennet_school_website TO admin;`
- `postgres=> \q`
------------
#### API Documentation (Swagger)
http://localhost:8080/api-docs (local host)
http://host_name/api-docs
------------
##### Set up database tables or update them after a schema change
- `yarn db:migrate`
##### Seed the initial data (admin accounts, relevant for the first setup):
- `yarn db:seed`
##### Start build:
- `yarn start`

56
backend/package.json Normal file

@@ -0,0 +1,56 @@
{
"name": "limmugennetschoolwebsite",
"description": "Limmu Gennet School Website - template backend",
"scripts": {
"start": "npm run db:migrate && npm run db:seed && npm run watch",
"lint": "eslint . --ext .js",
"db:migrate": "sequelize-cli db:migrate",
"db:seed": "sequelize-cli db:seed:all",
"db:drop": "sequelize-cli db:drop",
"db:create": "sequelize-cli db:create",
"watch": "node watcher.js"
},
"dependencies": {
"@google-cloud/storage": "^5.18.2",
"axios": "^1.6.7",
"bcrypt": "5.1.1",
"chokidar": "^4.0.3",
"cors": "2.8.5",
"csv-parser": "^3.0.0",
"express": "4.18.2",
"formidable": "1.2.2",
"helmet": "4.1.1",
"json2csv": "^5.0.7",
"jsonwebtoken": "8.5.1",
"lodash": "4.17.21",
"moment": "2.30.1",
"multer": "^1.4.4",
"mysql2": "2.2.5",
"nodemailer": "6.9.9",
"passport": "^0.7.0",
"passport-google-oauth2": "^0.2.0",
"passport-jwt": "^4.0.1",
"passport-microsoft": "^0.1.0",
"pg": "8.4.1",
"pg-hstore": "2.3.4",
"sequelize": "6.35.2",
"sequelize-json-schema": "^2.1.1",
"sqlite": "4.0.15",
"swagger-jsdoc": "^6.2.8",
"swagger-ui-express": "^5.0.0",
"tedious": "^18.2.4"
},
"engines": {
"node": ">=18"
},
"private": true,
"devDependencies": {
"cross-env": "7.0.3",
"eslint": "^8.23.1",
"eslint-plugin-import": "^2.29.1",
"mocha": "8.1.3",
"node-mocks-http": "1.9.0",
"nodemon": "2.0.5",
"sequelize-cli": "6.6.2"
}
}


@@ -0,0 +1,484 @@
"use strict";
const fs = require("fs");
const path = require("path");
const http = require("http");
const https = require("https");
const { URL } = require("url");
let CONFIG_CACHE = null;
class LocalAIApi {
static createResponse(params, options) {
return createResponse(params, options);
}
static request(pathValue, payload, options) {
return request(pathValue, payload, options);
}
static fetchStatus(aiRequestId, options) {
return fetchStatus(aiRequestId, options);
}
static awaitResponse(aiRequestId, options) {
return awaitResponse(aiRequestId, options);
}
static extractText(response) {
return extractText(response);
}
static decodeJsonFromResponse(response) {
return decodeJsonFromResponse(response);
}
}
async function createResponse(params, options = {}) {
const payload = { ...(params || {}) };
if (!Array.isArray(payload.input) || payload.input.length === 0) {
return {
success: false,
error: "input_missing",
message: 'Parameter "input" is required and must be a non-empty array.',
};
}
const cfg = config();
if (!payload.model) {
payload.model = cfg.defaultModel;
}
const initial = await request(options.path, payload, options);
if (!initial.success) {
return initial;
}
const data = initial.data;
if (data && typeof data === "object" && data.ai_request_id) {
const pollTimeout = Number(options.poll_timeout ?? 300);
const pollInterval = Number(options.poll_interval ?? 5);
return await awaitResponse(data.ai_request_id, {
interval: pollInterval,
timeout: pollTimeout,
headers: options.headers,
timeout_per_call: options.timeout,
verify_tls: options.verify_tls,
});
}
return initial;
}
async function request(pathValue, payload = {}, options = {}) {
const cfg = config();
const resolvedPath = pathValue || options.path || cfg.responsesPath;
if (!resolvedPath) {
return {
success: false,
error: "project_id_missing",
message: "PROJECT_ID is not defined; cannot resolve AI proxy endpoint.",
};
}
if (!cfg.projectUuid) {
return {
success: false,
error: "project_uuid_missing",
message: "PROJECT_UUID is not defined; aborting AI request.",
};
}
const bodyPayload = { ...(payload || {}) };
if (!bodyPayload.project_uuid) {
bodyPayload.project_uuid = cfg.projectUuid;
}
const url = buildUrl(resolvedPath, cfg.baseUrl);
const timeout = resolveTimeout(options.timeout, cfg.timeout);
const verifyTls = resolveVerifyTls(options.verify_tls, cfg.verifyTls);
const headers = {
Accept: "application/json",
"Content-Type": "application/json",
[cfg.projectHeader]: cfg.projectUuid,
};
if (Array.isArray(options.headers)) {
for (const header of options.headers) {
if (typeof header === "string" && header.includes(":")) {
const [name, value] = header.split(":", 2);
headers[name.trim()] = value.trim();
}
}
}
const body = JSON.stringify(bodyPayload);
return sendRequest(url, "POST", body, headers, timeout, verifyTls);
}
async function fetchStatus(aiRequestId, options = {}) {
const cfg = config();
if (!cfg.projectUuid) {
return {
success: false,
error: "project_uuid_missing",
message: "PROJECT_UUID is not defined; aborting status check.",
};
}
const statusPath = resolveStatusPath(aiRequestId, cfg);
const url = buildUrl(statusPath, cfg.baseUrl);
const timeout = resolveTimeout(options.timeout, cfg.timeout);
const verifyTls = resolveVerifyTls(options.verify_tls, cfg.verifyTls);
const headers = {
Accept: "application/json",
[cfg.projectHeader]: cfg.projectUuid,
};
if (Array.isArray(options.headers)) {
for (const header of options.headers) {
if (typeof header === "string" && header.includes(":")) {
const [name, value] = header.split(":", 2);
headers[name.trim()] = value.trim();
}
}
}
return sendRequest(url, "GET", null, headers, timeout, verifyTls);
}
async function awaitResponse(aiRequestId, options = {}) {
const timeout = Number(options.timeout ?? 300);
const interval = Math.max(Number(options.interval ?? 5), 1);
const deadline = Date.now() + Math.max(timeout, interval) * 1000;
while (true) {
const statusResp = await fetchStatus(aiRequestId, {
headers: options.headers,
timeout: options.timeout_per_call,
verify_tls: options.verify_tls,
});
if (statusResp.success) {
const data = statusResp.data || {};
if (data && typeof data === "object") {
if (data.status === "success") {
return {
success: true,
status: 200,
data: data.response || data,
};
}
if (data.status === "failed") {
return {
success: false,
status: 500,
error: String(data.error || "AI request failed"),
data,
};
}
}
} else {
return statusResp;
}
if (Date.now() >= deadline) {
return {
success: false,
error: "timeout",
message: "Timed out waiting for AI response.",
};
}
await sleep(interval * 1000);
}
}
function extractText(response) {
const payload = response && typeof response === "object" ? response.data || response : null;
if (!payload || typeof payload !== "object") {
return "";
}
if (Array.isArray(payload.output)) {
let combined = "";
for (const item of payload.output) {
if (!item || !Array.isArray(item.content)) {
continue;
}
for (const block of item.content) {
if (
block &&
typeof block === "object" &&
block.type === "output_text" &&
typeof block.text === "string" &&
block.text.length > 0
) {
combined += block.text;
}
}
}
if (combined) {
return combined;
}
}
if (
payload.choices &&
payload.choices[0] &&
payload.choices[0].message &&
typeof payload.choices[0].message.content === "string"
) {
return payload.choices[0].message.content;
}
return "";
}
function decodeJsonFromResponse(response) {
const text = extractText(response);
if (!text) {
throw new Error("No text found in AI response.");
}
const parsed = parseJson(text);
if (parsed.ok && parsed.value && typeof parsed.value === "object") {
return parsed.value;
}
const stripped = stripJsonFence(text);
if (stripped !== text) {
const parsedStripped = parseJson(stripped);
if (parsedStripped.ok && parsedStripped.value && typeof parsedStripped.value === "object") {
return parsedStripped.value;
}
throw new Error(`JSON parse failed after stripping fences: ${parsedStripped.error}`);
}
throw new Error(`JSON parse failed: ${parsed.error}`);
}
function config() {
if (CONFIG_CACHE) {
return CONFIG_CACHE;
}
ensureEnvLoaded();
const baseUrl = process.env.AI_PROXY_BASE_URL || "https://flatlogic.com";
const projectId = process.env.PROJECT_ID || null;
let responsesPath = process.env.AI_RESPONSES_PATH || null;
if (!responsesPath && projectId) {
responsesPath = `/projects/${projectId}/ai-request`;
}
const timeout = resolveTimeout(process.env.AI_TIMEOUT, 30);
const verifyTls = resolveVerifyTls(process.env.AI_VERIFY_TLS, true);
CONFIG_CACHE = {
baseUrl,
responsesPath,
projectId,
projectUuid: process.env.PROJECT_UUID || null,
projectHeader: process.env.AI_PROJECT_HEADER || "project-uuid",
defaultModel: process.env.AI_DEFAULT_MODEL || "gpt-5-mini",
timeout,
verifyTls,
};
return CONFIG_CACHE;
}
function buildUrl(pathValue, baseUrl) {
const trimmed = String(pathValue || "").trim();
if (trimmed === "") {
return baseUrl;
}
if (trimmed.startsWith("http://") || trimmed.startsWith("https://")) {
return trimmed;
}
if (trimmed.startsWith("/")) {
return `${baseUrl}${trimmed}`;
}
return `${baseUrl}/${trimmed}`;
}
function resolveStatusPath(aiRequestId, cfg) {
const basePath = (cfg.responsesPath || "").replace(/\/+$/, "");
if (!basePath) {
return `/ai-request/${encodeURIComponent(String(aiRequestId))}/status`;
}
const normalized = basePath.endsWith("/ai-request") ? basePath : `${basePath}/ai-request`;
return `${normalized}/${encodeURIComponent(String(aiRequestId))}/status`;
}
function sendRequest(urlString, method, body, headers, timeoutSeconds, verifyTls) {
return new Promise((resolve) => {
let targetUrl;
try {
targetUrl = new URL(urlString);
} catch (err) {
resolve({
success: false,
error: "invalid_url",
message: err.message,
});
return;
}
const isHttps = targetUrl.protocol === "https:";
const requestFn = isHttps ? https.request : http.request;
const options = {
protocol: targetUrl.protocol,
hostname: targetUrl.hostname,
port: targetUrl.port || (isHttps ? 443 : 80),
path: `${targetUrl.pathname}${targetUrl.search}`,
method: method.toUpperCase(),
headers,
timeout: Math.max(Number(timeoutSeconds || 30), 1) * 1000,
};
if (isHttps) {
options.rejectUnauthorized = Boolean(verifyTls);
}
const req = requestFn(options, (res) => {
let responseBody = "";
res.setEncoding("utf8");
res.on("data", (chunk) => {
responseBody += chunk;
});
res.on("end", () => {
const status = res.statusCode || 0;
const parsed = parseJson(responseBody);
const payload = parsed.ok ? parsed.value : responseBody;
if (status >= 200 && status < 300) {
const result = {
success: true,
status,
data: payload,
};
if (!parsed.ok) {
result.json_error = parsed.error;
}
resolve(result);
return;
}
const errorMessage =
parsed.ok && payload && typeof payload === "object"
? String(payload.error || payload.message || "AI proxy request failed")
: String(responseBody || "AI proxy request failed");
resolve({
success: false,
status,
error: errorMessage,
response: payload,
json_error: parsed.ok ? undefined : parsed.error,
});
});
});
req.on("timeout", () => {
req.destroy(new Error("request_timeout"));
});
req.on("error", (err) => {
resolve({
success: false,
error: "request_failed",
message: err.message,
});
});
if (body) {
req.write(body);
}
req.end();
});
}
function parseJson(value) {
if (typeof value !== "string" || value.trim() === "") {
return { ok: false, error: "empty_response" };
}
try {
return { ok: true, value: JSON.parse(value) };
} catch (err) {
return { ok: false, error: err.message };
}
}
function stripJsonFence(text) {
const trimmed = text.trim();
if (trimmed.startsWith("```json")) {
return trimmed.replace(/^```json/, "").replace(/```$/, "").trim();
}
if (trimmed.startsWith("```")) {
return trimmed.replace(/^```/, "").replace(/```$/, "").trim();
}
return text;
}
function resolveTimeout(value, fallback) {
const parsed = Number.parseInt(String(value ?? fallback), 10);
return Number.isNaN(parsed) ? Number(fallback) : parsed;
}
function resolveVerifyTls(value, fallback) {
if (value === undefined || value === null) {
return Boolean(fallback);
}
return String(value).toLowerCase() !== "false" && String(value) !== "0";
}
function ensureEnvLoaded() {
if (process.env.PROJECT_UUID && process.env.PROJECT_ID) {
return;
}
const envPath = path.resolve(__dirname, "../../../../.env");
if (!fs.existsSync(envPath)) {
return;
}
let content;
try {
content = fs.readFileSync(envPath, "utf8");
} catch (err) {
throw new Error(`Failed to read executor .env: ${err.message}`);
}
for (const line of content.split(/\r?\n/)) {
const trimmed = line.trim();
if (!trimmed || trimmed.startsWith("#") || !trimmed.includes("=")) {
continue;
}
const [rawKey, ...rest] = trimmed.split("=");
const key = rawKey.trim();
if (!key) {
continue;
}
const value = rest.join("=").trim().replace(/^['"]|['"]$/g, "");
if (!process.env[key]) {
process.env[key] = value;
}
}
}
function sleep(ms) {
return new Promise((resolve) => setTimeout(resolve, ms));
}
module.exports = {
LocalAIApi,
createResponse,
request,
fetchStatus,
awaitResponse,
extractText,
decodeJsonFromResponse,
};
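For orientation, `extractText` walks a Responses-API-style `output` array, concatenating `output_text` blocks, and falls back to `choices[0].message.content`. A standalone sketch of that traversal (the sample payload is hypothetical; the real module additionally unwraps `response.data`):

```javascript
// Standalone sketch of the extractText traversal above (sample payload is made up).
function extractTextSketch(payload) {
  if (Array.isArray(payload.output)) {
    let combined = "";
    for (const item of payload.output) {
      for (const block of (item && item.content) || []) {
        if (block && block.type === "output_text" && typeof block.text === "string") {
          combined += block.text;
        }
      }
    }
    if (combined) return combined;
  }
  // Chat-completions-style fallback.
  const msg = payload.choices && payload.choices[0] && payload.choices[0].message;
  return msg && typeof msg.content === "string" ? msg.content : "";
}

const sample = {
  output: [
    { content: [{ type: "output_text", text: "Hello, " }, { type: "output_text", text: "world" }] },
  ],
};
console.log(extractTextSketch(sample)); // Hello, world
```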

68
backend/src/auth/auth.js Normal file

@ -0,0 +1,68 @@
const config = require('../config');
const providers = config.providers;
const helpers = require('../helpers');
const db = require('../db/models');
const passport = require('passport');
const JWTstrategy = require('passport-jwt').Strategy;
const ExtractJWT = require('passport-jwt').ExtractJwt;
const GoogleStrategy = require('passport-google-oauth2').Strategy;
const MicrosoftStrategy = require('passport-microsoft').Strategy;
const UsersDBApi = require('../db/api/users');
passport.use(new JWTstrategy({
passReqToCallback: true,
secretOrKey: config.secret_key,
jwtFromRequest: ExtractJWT.fromAuthHeaderAsBearerToken()
}, async (req, token, done) => {
try {
const user = await UsersDBApi.findBy({ email: token.user.email });
if (user && user.disabled) {
return done(new Error(`User '${user.email}' is disabled`));
}
req.currentUser = user;
return done(null, user);
} catch (error) {
done(error);
}
}));
passport.use(new GoogleStrategy({
clientID: config.google.clientId,
clientSecret: config.google.clientSecret,
callbackURL: config.apiUrl + '/auth/signin/google/callback',
passReqToCallback: true
},
function (request, accessToken, refreshToken, profile, done) {
socialStrategy(profile.email, profile, providers.GOOGLE, done);
}
));
passport.use(new MicrosoftStrategy({
clientID: config.microsoft.clientId,
clientSecret: config.microsoft.clientSecret,
callbackURL: config.apiUrl + '/auth/signin/microsoft/callback',
passReqToCallback: true
},
function (request, accessToken, refreshToken, profile, done) {
const email = profile._json.mail || profile._json.userPrincipalName;
socialStrategy(email, profile, providers.MICROSOFT, done);
}
));
function socialStrategy(email, profile, provider, done) {
db.users.findOrCreate({ where: { email, provider } })
.then(([user]) => {
const body = {
id: user.id,
email: user.email,
name: profile.displayName,
};
const token = helpers.jwtSign({ user: body });
return done(null, { token });
})
.catch(done);
}
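The Microsoft strategy resolves the account email from either `profile._json.mail` or `userPrincipalName`, since Microsoft Graph profiles may populate only one of the two. A minimal sketch of that fallback (the sample profiles are made up):

```javascript
// Mirrors the Microsoft email fallback used in the strategy above.
const resolveEmail = (json) => json.mail || json.userPrincipalName;

console.log(resolveEmail({ mail: "a@example.com" }));              // a@example.com
console.log(resolveEmail({ userPrincipalName: "b@example.com" })); // b@example.com
```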

79
backend/src/config.js Normal file

@ -0,0 +1,79 @@
const os = require('os');
const config = {
gcloud: {
bucket: "fldemo-files",
hash: "afeefb9d49f5b7977577876b99532ac7"
},
bcrypt: {
saltRounds: 12
},
admin_pass: "561da65b",
user_pass: "96c678795cb3",
admin_email: "admin@flatlogic.com",
providers: {
LOCAL: 'local',
GOOGLE: 'google',
MICROSOFT: 'microsoft'
},
secret_key: process.env.SECRET_KEY || '561da65b-ab63-42f5-8cdb-96c678795cb3',
remote: '',
port: process.env.NODE_ENV === "production" ? "" : "8080",
hostUI: process.env.NODE_ENV === "production" ? "" : "http://localhost",
portUI: process.env.NODE_ENV === "production" ? "" : "3000",
portUIProd: process.env.NODE_ENV === "production" ? "" : ":3000",
swaggerUI: process.env.NODE_ENV === "production" ? "" : "http://localhost",
swaggerPort: process.env.NODE_ENV === "production" ? "" : ":8080",
google: {
clientId: process.env.GOOGLE_CLIENT_ID || '',
clientSecret: process.env.GOOGLE_CLIENT_SECRET || '',
},
microsoft: {
clientId: process.env.MS_CLIENT_ID || '',
clientSecret: process.env.MS_CLIENT_SECRET || '',
},
uploadDir: os.tmpdir(),
email: {
from: 'Limmu Gennet School Website <app@flatlogic.app>',
host: 'email-smtp.us-east-1.amazonaws.com',
port: 587,
auth: {
user: process.env.EMAIL_USER || '',
pass: process.env.EMAIL_PASS,
},
tls: {
rejectUnauthorized: false
}
},
roles: {
admin: 'Administrator',
user: 'Student',
},
project_uuid: '561da65b-ab63-42f5-8cdb-96c678795cb3',
flHost: process.env.NODE_ENV === 'production' || process.env.NODE_ENV === 'dev_stage' ? 'https://flatlogic.com/projects' : 'http://localhost:3000/projects',
gpt_key: process.env.GPT_KEY || '',
};
config.pexelsKey = process.env.PEXELS_KEY || '';
config.pexelsQuery = 'Sunrise over open countryside';
config.host = process.env.NODE_ENV === "production" ? config.remote : "http://localhost";
config.apiUrl = `${config.host}${config.port ? `:${config.port}` : ``}/api`;
config.swaggerUrl = `${config.swaggerUI}${config.swaggerPort}`;
config.uiUrl = `${config.hostUI}${config.portUI ? `:${config.portUI}` : ``}/#`;
config.backUrl = `${config.hostUI}${config.portUI ? `:${config.portUI}` : ``}`;
module.exports = config;
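With `NODE_ENV` unset (development), the derived URLs compose from host and port fragments. A quick sketch of the same template logic using the development defaults from this file:

```javascript
// Development defaults from config.js; production leaves port/hostUI empty.
const host = "http://localhost";
const port = "8080";
const hostUI = "http://localhost";
const portUI = "3000";

// Same composition as config.apiUrl and config.uiUrl above.
const apiUrl = `${host}${port ? ":" + port : ""}/api`;
const uiUrl = `${hostUI}${portUI ? ":" + portUI : ""}/#`;

console.log(apiUrl); // http://localhost:8080/api
console.log(uiUrl);  // http://localhost:3000/#
```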


@ -0,0 +1,731 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Admission_applicationsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const admission_applications = await db.admission_applications.create(
{
id: data.id || undefined,
entry_type: data.entry_type || null,
applicant_full_name: data.applicant_full_name || null,
gender: data.gender || null,
date_of_birth: data.date_of_birth || null,
previous_school: data.previous_school || null,
guardian_full_name: data.guardian_full_name || null,
guardian_phone: data.guardian_phone || null,
guardian_email: data.guardian_email || null,
address_text: data.address_text || null,
submitted_at: data.submitted_at || null,
status: data.status || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await admission_applications.setRequested_grade( data.requested_grade || null, {
transaction,
});
await admission_applications.setRequested_stream( data.requested_stream || null, {
transaction,
});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.admission_applications.getTableName(),
belongsToColumn: 'documents',
belongsToId: admission_applications.id,
},
data.documents,
options,
);
return admission_applications;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const admission_applicationsData = data.map((item, index) => ({
id: item.id || undefined,
entry_type: item.entry_type || null,
applicant_full_name: item.applicant_full_name || null,
gender: item.gender || null,
date_of_birth: item.date_of_birth || null,
previous_school: item.previous_school || null,
guardian_full_name: item.guardian_full_name || null,
guardian_phone: item.guardian_phone || null,
guardian_email: item.guardian_email || null,
address_text: item.address_text || null,
submitted_at: item.submitted_at || null,
status: item.status || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const admission_applications = await db.admission_applications.bulkCreate(admission_applicationsData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < admission_applications.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.admission_applications.getTableName(),
belongsToColumn: 'documents',
belongsToId: admission_applications[i].id,
},
data[i].documents,
options,
);
}
return admission_applications;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const admission_applications = await db.admission_applications.findByPk(id, { transaction });
const updatePayload = {};
if (data.entry_type !== undefined) updatePayload.entry_type = data.entry_type;
if (data.applicant_full_name !== undefined) updatePayload.applicant_full_name = data.applicant_full_name;
if (data.gender !== undefined) updatePayload.gender = data.gender;
if (data.date_of_birth !== undefined) updatePayload.date_of_birth = data.date_of_birth;
if (data.previous_school !== undefined) updatePayload.previous_school = data.previous_school;
if (data.guardian_full_name !== undefined) updatePayload.guardian_full_name = data.guardian_full_name;
if (data.guardian_phone !== undefined) updatePayload.guardian_phone = data.guardian_phone;
if (data.guardian_email !== undefined) updatePayload.guardian_email = data.guardian_email;
if (data.address_text !== undefined) updatePayload.address_text = data.address_text;
if (data.submitted_at !== undefined) updatePayload.submitted_at = data.submitted_at;
if (data.status !== undefined) updatePayload.status = data.status;
updatePayload.updatedById = currentUser.id;
await admission_applications.update(updatePayload, {transaction});
if (data.requested_grade !== undefined) {
await admission_applications.setRequested_grade(
data.requested_grade,
{ transaction }
);
}
if (data.requested_stream !== undefined) {
await admission_applications.setRequested_stream(
data.requested_stream,
{ transaction }
);
}
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.admission_applications.getTableName(),
belongsToColumn: 'documents',
belongsToId: admission_applications.id,
},
data.documents,
options,
);
return admission_applications;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const admission_applications = await db.admission_applications.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of admission_applications) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of admission_applications) {
await record.destroy({transaction});
}
});
return admission_applications;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const admission_applications = await db.admission_applications.findByPk(id, options);
await admission_applications.update({
deletedBy: currentUser.id
}, {
transaction,
});
await admission_applications.destroy({
transaction
});
return admission_applications;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const admission_applications = await db.admission_applications.findOne({ where, transaction });
if (!admission_applications) {
return admission_applications;
}
const output = admission_applications.get({plain: true});
output.requested_grade = await admission_applications.getRequested_grade({
transaction
});
output.requested_stream = await admission_applications.getRequested_stream({
transaction
});
output.documents = await admission_applications.getDocuments({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const orderBy = null;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.grades,
as: 'requested_grade',
where: filter.requested_grade ? {
[Op.or]: [
{ id: { [Op.in]: filter.requested_grade.split('|').map(term => Utils.uuid(term)) } },
{
label: {
[Op.or]: filter.requested_grade.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.streams,
as: 'requested_stream',
where: filter.requested_stream ? {
[Op.or]: [
{ id: { [Op.in]: filter.requested_stream.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.requested_stream.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.file,
as: 'documents',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.applicant_full_name) {
where = {
...where,
[Op.and]: Utils.ilike(
'admission_applications',
'applicant_full_name',
filter.applicant_full_name,
),
};
}
if (filter.previous_school) {
where = {
...where,
[Op.and]: Utils.ilike(
'admission_applications',
'previous_school',
filter.previous_school,
),
};
}
if (filter.guardian_full_name) {
where = {
...where,
[Op.and]: Utils.ilike(
'admission_applications',
'guardian_full_name',
filter.guardian_full_name,
),
};
}
if (filter.guardian_phone) {
where = {
...where,
[Op.and]: Utils.ilike(
'admission_applications',
'guardian_phone',
filter.guardian_phone,
),
};
}
if (filter.guardian_email) {
where = {
...where,
[Op.and]: Utils.ilike(
'admission_applications',
'guardian_email',
filter.guardian_email,
),
};
}
if (filter.address_text) {
where = {
...where,
[Op.and]: Utils.ilike(
'admission_applications',
'address_text',
filter.address_text,
),
};
}
if (filter.date_of_birthRange) {
const [start, end] = filter.date_of_birthRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
date_of_birth: {
...where.date_of_birth,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
date_of_birth: {
...where.date_of_birth,
[Op.lte]: end,
},
};
}
}
if (filter.submitted_atRange) {
const [start, end] = filter.submitted_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
submitted_at: {
...where.submitted_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
submitted_at: {
...where.submitted_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.entry_type) {
where = {
...where,
entry_type: filter.entry_type,
};
}
if (filter.gender) {
where = {
...where,
gender: filter.gender,
};
}
if (filter.status) {
where = {
...where,
status: filter.status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.admission_applications.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'admission_applications',
'applicant_full_name',
query,
),
],
};
}
const records = await db.admission_applications.findAll({
attributes: [ 'id', 'applicant_full_name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['applicant_full_name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.applicant_full_name,
}));
}
};
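The `date_of_birthRange` and `submitted_atRange` branches merge `Op.gte`/`Op.lte` keys into a single per-field object so both bounds can coexist. A standalone sketch of that merge, with plain symbols standing in for Sequelize's operators:

```javascript
// Plain-symbol stand-ins for Sequelize's Op.gte / Op.lte.
const Op = { gte: Symbol("gte"), lte: Symbol("lte") };

// Mirrors the date_of_birthRange / submitted_atRange handling above:
// each bound is merged into the same field object, so gte and lte coexist.
function applyRange(where, field, [start, end]) {
  if (start !== undefined && start !== null && start !== "") {
    where = { ...where, [field]: { ...where[field], [Op.gte]: start } };
  }
  if (end !== undefined && end !== null && end !== "") {
    where = { ...where, [field]: { ...where[field], [Op.lte]: end } };
  }
  return where;
}

let where = {};
where = applyRange(where, "submitted_at", ["2024-01-01", "2024-12-31"]);
console.log(where.submitted_at[Op.gte]); // 2024-01-01
console.log(where.submitted_at[Op.lte]); // 2024-12-31
```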


@ -0,0 +1,591 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class AlumniDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const alumni = await db.alumni.create(
{
id: data.id || undefined,
full_name: data.full_name || null,
graduation_year: data.graduation_year || null,
current_role: data.current_role || null,
organization: data.organization || null,
bio: data.bio || null,
phone_number: data.phone_number || null,
email: data.email || null,
notable: data.notable || false,
published: data.published || false,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.alumni.getTableName(),
belongsToColumn: 'photo',
belongsToId: alumni.id,
},
data.photo,
options,
);
return alumni;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const alumniData = data.map((item, index) => ({
id: item.id || undefined,
full_name: item.full_name || null,
graduation_year: item.graduation_year || null,
current_role: item.current_role || null,
organization: item.organization || null,
bio: item.bio || null,
phone_number: item.phone_number || null,
email: item.email || null,
notable: item.notable || false,
published: item.published || false,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const alumni = await db.alumni.bulkCreate(alumniData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < alumni.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.alumni.getTableName(),
belongsToColumn: 'photo',
belongsToId: alumni[i].id,
},
data[i].photo,
options,
);
}
return alumni;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const alumni = await db.alumni.findByPk(id, { transaction });
const updatePayload = {};
if (data.full_name !== undefined) updatePayload.full_name = data.full_name;
if (data.graduation_year !== undefined) updatePayload.graduation_year = data.graduation_year;
if (data.current_role !== undefined) updatePayload.current_role = data.current_role;
if (data.organization !== undefined) updatePayload.organization = data.organization;
if (data.bio !== undefined) updatePayload.bio = data.bio;
if (data.phone_number !== undefined) updatePayload.phone_number = data.phone_number;
if (data.email !== undefined) updatePayload.email = data.email;
if (data.notable !== undefined) updatePayload.notable = data.notable;
if (data.published !== undefined) updatePayload.published = data.published;
updatePayload.updatedById = currentUser.id;
await alumni.update(updatePayload, {transaction});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.alumni.getTableName(),
belongsToColumn: 'photo',
belongsToId: alumni.id,
},
data.photo,
options,
);
return alumni;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const alumni = await db.alumni.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of alumni) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of alumni) {
await record.destroy({transaction});
}
});
return alumni;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const alumni = await db.alumni.findByPk(id, options);
await alumni.update({
deletedBy: currentUser.id
}, {
transaction,
});
await alumni.destroy({
transaction
});
return alumni;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const alumni = await db.alumni.findOne({ where, transaction });
if (!alumni) {
return alumni;
}
const output = alumni.get({plain: true});
output.photo = await alumni.getPhoto({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const orderBy = null;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.file,
as: 'photo',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.full_name) {
where = {
...where,
[Op.and]: Utils.ilike(
'alumni',
'full_name',
filter.full_name,
),
};
}
if (filter.graduation_year) {
where = {
...where,
[Op.and]: Utils.ilike(
'alumni',
'graduation_year',
filter.graduation_year,
),
};
}
if (filter.current_role) {
where = {
...where,
[Op.and]: Utils.ilike(
'alumni',
'current_role',
filter.current_role,
),
};
}
if (filter.organization) {
where = {
...where,
[Op.and]: Utils.ilike(
'alumni',
'organization',
filter.organization,
),
};
}
if (filter.bio) {
where = {
...where,
[Op.and]: Utils.ilike(
'alumni',
'bio',
filter.bio,
),
};
}
if (filter.phone_number) {
where = {
...where,
[Op.and]: Utils.ilike(
'alumni',
'phone_number',
filter.phone_number,
),
};
}
if (filter.email) {
where = {
...where,
[Op.and]: Utils.ilike(
'alumni',
'email',
filter.email,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.notable) {
where = {
...where,
notable: filter.notable,
};
}
if (filter.published) {
where = {
...where,
published: filter.published,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.alumni.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'alumni',
'full_name',
query,
),
],
};
}
const records = await db.alumni.findAll({
attributes: [ 'id', 'full_name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['full_name', 'ASC']], // Sequelize option is `order`; `orderBy` is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.full_name,
}));
}
};
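The `findAll` above derives its row offset from `filter.page` and `filter.limit`, both of which typically arrive as query-string values. A minimal standalone sketch of that page/limit arithmetic, with a guard for a missing page value (`computeOffset` is a hypothetical helper name, not part of the service):

```javascript
// Sketch of the findAll pagination math: page and limit arrive as strings
// from the query string, so both are coerced before multiplying.
function computeOffset(page, limit) {
  const currentPage = +page || 0;      // NaN or undefined page -> first page
  const perPage = Number(limit) || 0;  // limit 0 means "no paging"
  return currentPage * perPage;
}

console.log(computeOffset('2', '10')); // offset for the third page of 10
```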


@@ -0,0 +1,681 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class AnnouncementsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const announcements = await db.announcements.create(
{
id: data.id || undefined,
title_om: data.title_om || null,
title_am: data.title_am || null,
title_en: data.title_en || null,
content_om: data.content_om || null,
content_am: data.content_am || null,
content_en: data.content_en || null,
visibility: data.visibility || null,
publish_from: data.publish_from || null,
publish_until: data.publish_until || null,
pinned: data.pinned || false,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.announcements.getTableName(),
belongsToColumn: 'images',
belongsToId: announcements.id,
},
data.images,
options,
);
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.announcements.getTableName(),
belongsToColumn: 'attachments',
belongsToId: announcements.id,
},
data.attachments,
options,
);
return announcements;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const announcementsData = data.map((item, index) => ({
id: item.id || undefined,
title_om: item.title_om || null,
title_am: item.title_am || null,
title_en: item.title_en || null,
content_om: item.content_om || null,
content_am: item.content_am || null,
content_en: item.content_en || null,
visibility: item.visibility || null,
publish_from: item.publish_from || null,
publish_until: item.publish_until || null,
pinned: item.pinned || false,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const announcements = await db.announcements.bulkCreate(announcementsData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < announcements.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.announcements.getTableName(),
belongsToColumn: 'images',
belongsToId: announcements[i].id,
},
data[i].images,
options,
);
}
for (let i = 0; i < announcements.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.announcements.getTableName(),
belongsToColumn: 'attachments',
belongsToId: announcements[i].id,
},
data[i].attachments,
options,
);
}
return announcements;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const announcements = await db.announcements.findByPk(id, { transaction }); // findByPk takes (id, options); the extra argument was discarded
const updatePayload = {};
if (data.title_om !== undefined) updatePayload.title_om = data.title_om;
if (data.title_am !== undefined) updatePayload.title_am = data.title_am;
if (data.title_en !== undefined) updatePayload.title_en = data.title_en;
if (data.content_om !== undefined) updatePayload.content_om = data.content_om;
if (data.content_am !== undefined) updatePayload.content_am = data.content_am;
if (data.content_en !== undefined) updatePayload.content_en = data.content_en;
if (data.visibility !== undefined) updatePayload.visibility = data.visibility;
if (data.publish_from !== undefined) updatePayload.publish_from = data.publish_from;
if (data.publish_until !== undefined) updatePayload.publish_until = data.publish_until;
if (data.pinned !== undefined) updatePayload.pinned = data.pinned;
updatePayload.updatedById = currentUser.id;
await announcements.update(updatePayload, {transaction});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.announcements.getTableName(),
belongsToColumn: 'images',
belongsToId: announcements.id,
},
data.images,
options,
);
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.announcements.getTableName(),
belongsToColumn: 'attachments',
belongsToId: announcements.id,
},
data.attachments,
options,
);
return announcements;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const announcements = await db.announcements.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of announcements) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of announcements) {
await record.destroy({transaction});
}
});
return announcements;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const announcements = await db.announcements.findByPk(id, options);
await announcements.update({
deletedBy: currentUser.id
}, {
transaction,
});
await announcements.destroy({
transaction
});
return announcements;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const announcements = await db.announcements.findOne({ where, transaction }); // findOne takes a single options object
if (!announcements) {
return announcements;
}
const output = announcements.get({plain: true});
output.images = await announcements.getImages({
transaction
});
output.attachments = await announcements.getAttachments({
transaction
});
return output;
}
static async findAll(filter, options) {
const limit = filter.limit || 0;
const currentPage = +filter.page || 0; // guard: a missing/non-numeric page must not produce a NaN offset
const offset = currentPage * limit;
let where = {};
let include = [
{
model: db.file,
as: 'images',
},
{
model: db.file,
as: 'attachments',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
// Accumulate conditions in an array: repeated assignments to the same
// [Op.and] symbol key would otherwise overwrite earlier filters.
if (filter.title_om) {
where = { ...where, [Op.and]: [...(where[Op.and] || []), Utils.ilike('announcements', 'title_om', filter.title_om)] };
}
if (filter.title_am) {
where = { ...where, [Op.and]: [...(where[Op.and] || []), Utils.ilike('announcements', 'title_am', filter.title_am)] };
}
if (filter.title_en) {
where = { ...where, [Op.and]: [...(where[Op.and] || []), Utils.ilike('announcements', 'title_en', filter.title_en)] };
}
if (filter.content_om) {
where = { ...where, [Op.and]: [...(where[Op.and] || []), Utils.ilike('announcements', 'content_om', filter.content_om)] };
}
if (filter.content_am) {
where = { ...where, [Op.and]: [...(where[Op.and] || []), Utils.ilike('announcements', 'content_am', filter.content_am)] };
}
if (filter.content_en) {
where = { ...where, [Op.and]: [...(where[Op.and] || []), Utils.ilike('announcements', 'content_en', filter.content_en)] };
}
if (filter.publish_fromRange) {
const [start, end] = filter.publish_fromRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
publish_from: {
...where.publish_from,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
publish_from: {
...where.publish_from,
[Op.lte]: end,
},
};
}
}
if (filter.publish_untilRange) {
const [start, end] = filter.publish_untilRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
publish_until: {
...where.publish_until,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
publish_until: {
...where.publish_until,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.visibility) {
where = {
...where,
visibility: filter.visibility,
};
}
if (filter.pinned) {
where = {
...where,
pinned: filter.pinned,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.announcements.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'announcements',
'title_en',
query,
),
],
};
}
const records = await db.announcements.findAll({
attributes: [ 'id', 'title_en' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['title_en', 'ASC']], // Sequelize option is `order`; `orderBy` is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.title_en,
}));
}
};
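The `publish_fromRange` / `publish_untilRange` / `createdAtRange` handling above layers an `Op.gte` and an `Op.lte` bound onto a single field condition, skipping empty bounds. A dependency-free sketch of the same merge (`applyRange` is a hypothetical name, and plain `gte`/`lte` string keys stand in for Sequelize's `Op.gte`/`Op.lte` symbols):

```javascript
// Mimics the range-filter merge in findAll: each defined bound is layered
// onto the existing condition for the field, so start-only, end-only and
// both-bounds filters all produce one object.
function applyRange(where, field, [start, end] = []) {
  let next = { ...where };
  if (start !== undefined && start !== null && start !== '') {
    next = { ...next, [field]: { ...next[field], gte: start } };
  }
  if (end !== undefined && end !== null && end !== '') {
    next = { ...next, [field]: { ...next[field], lte: end } };
  }
  return next;
}

const where = applyRange({}, 'publish_from', ['2026-01-01', '2026-02-01']);
// -> { publish_from: { gte: '2026-01-01', lte: '2026-02-01' } }
```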


@@ -0,0 +1,595 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Assignment_submissionsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const assignment_submissions = await db.assignment_submissions.create(
{
id: data.id || undefined,
comment: data.comment || null,
submitted_at: data.submitted_at || null,
status: data.status || null,
grade_value: data.grade_value || null,
teacher_feedback: data.teacher_feedback || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await assignment_submissions.setAssignment( data.assignment || null, {
transaction,
});
await assignment_submissions.setStudent( data.student || null, {
transaction,
});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.assignment_submissions.getTableName(),
belongsToColumn: 'files',
belongsToId: assignment_submissions.id,
},
data.files,
options,
);
return assignment_submissions;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const assignment_submissionsData = data.map((item, index) => ({
id: item.id || undefined,
comment: item.comment || null,
submitted_at: item.submitted_at || null,
status: item.status || null,
grade_value: item.grade_value || null,
teacher_feedback: item.teacher_feedback || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const assignment_submissions = await db.assignment_submissions.bulkCreate(assignment_submissionsData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < assignment_submissions.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.assignment_submissions.getTableName(),
belongsToColumn: 'files',
belongsToId: assignment_submissions[i].id,
},
data[i].files,
options,
);
}
return assignment_submissions;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const assignment_submissions = await db.assignment_submissions.findByPk(id, { transaction }); // findByPk takes (id, options); the extra argument was discarded
const updatePayload = {};
if (data.comment !== undefined) updatePayload.comment = data.comment;
if (data.submitted_at !== undefined) updatePayload.submitted_at = data.submitted_at;
if (data.status !== undefined) updatePayload.status = data.status;
if (data.grade_value !== undefined) updatePayload.grade_value = data.grade_value;
if (data.teacher_feedback !== undefined) updatePayload.teacher_feedback = data.teacher_feedback;
updatePayload.updatedById = currentUser.id;
await assignment_submissions.update(updatePayload, {transaction});
if (data.assignment !== undefined) {
await assignment_submissions.setAssignment(
data.assignment,
{ transaction }
);
}
if (data.student !== undefined) {
await assignment_submissions.setStudent(
data.student,
{ transaction }
);
}
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.assignment_submissions.getTableName(),
belongsToColumn: 'files',
belongsToId: assignment_submissions.id,
},
data.files,
options,
);
return assignment_submissions;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const assignment_submissions = await db.assignment_submissions.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of assignment_submissions) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of assignment_submissions) {
await record.destroy({transaction});
}
});
return assignment_submissions;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const assignment_submissions = await db.assignment_submissions.findByPk(id, options);
await assignment_submissions.update({
deletedBy: currentUser.id
}, {
transaction,
});
await assignment_submissions.destroy({
transaction
});
return assignment_submissions;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const assignment_submissions = await db.assignment_submissions.findOne({ where, transaction }); // findOne takes a single options object
if (!assignment_submissions) {
return assignment_submissions;
}
const output = assignment_submissions.get({plain: true});
output.assignment = await assignment_submissions.getAssignment({
transaction
});
output.student = await assignment_submissions.getStudent({
transaction
});
output.files = await assignment_submissions.getFiles({
transaction
});
return output;
}
static async findAll(filter, options) {
const limit = filter.limit || 0;
const currentPage = +filter.page || 0; // guard: a missing/non-numeric page must not produce a NaN offset
const offset = currentPage * limit;
let where = {};
let include = [
{
model: db.assignments,
as: 'assignment',
where: filter.assignment ? {
[Op.or]: [
{ id: { [Op.in]: filter.assignment.split('|').map(term => Utils.uuid(term)) } },
{
title: {
[Op.or]: filter.assignment.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.students,
as: 'student',
where: filter.student ? {
[Op.or]: [
{ id: { [Op.in]: filter.student.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.student.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.file,
as: 'files',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
// Accumulate conditions in an array: repeated assignments to the same
// [Op.and] symbol key would otherwise overwrite earlier filters.
if (filter.comment) {
where = { ...where, [Op.and]: [...(where[Op.and] || []), Utils.ilike('assignment_submissions', 'comment', filter.comment)] };
}
if (filter.teacher_feedback) {
where = { ...where, [Op.and]: [...(where[Op.and] || []), Utils.ilike('assignment_submissions', 'teacher_feedback', filter.teacher_feedback)] };
}
if (filter.submitted_atRange) {
const [start, end] = filter.submitted_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
submitted_at: {
...where.submitted_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
submitted_at: {
...where.submitted_at,
[Op.lte]: end,
},
};
}
}
if (filter.grade_valueRange) {
const [start, end] = filter.grade_valueRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
grade_value: {
...where.grade_value,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
grade_value: {
...where.grade_value,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.status) {
where = {
...where,
status: filter.status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.assignment_submissions.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'assignment_submissions',
'status',
query,
),
],
};
}
const records = await db.assignment_submissions.findAll({
attributes: [ 'id', 'status' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['status', 'ASC']], // Sequelize option is `order`; `orderBy` is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.status,
}));
}
};
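The relation filters in `findAll` above (`filter.assignment`, `filter.student`) accept a pipe-separated list of terms and match each term either by id or by a case-insensitive name fragment. A self-contained sketch of that term expansion (`expandTerms` is a hypothetical stand-in; the real code feeds the results to `Op.in` and `Op.iLike`):

```javascript
// Expands a pipe-separated filter ('uuid-1|smith') the way the include
// clauses do: each term becomes a candidate id match plus a case-insensitive
// substring pattern for the display field.
function expandTerms(raw) {
  return raw.split('|').map((term) => ({
    id: term,                           // compared with Op.in in the real query
    pattern: `%${term.toLowerCase()}%`, // fed to Op.iLike in the real query
  }));
}

console.log(expandTerms('abebe|kebede').map((t) => t.pattern));
```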


@@ -0,0 +1,654 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class AssignmentsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const assignments = await db.assignments.create(
{
id: data.id || undefined,
title: data.title || null,
instructions: data.instructions || null,
assigned_at: data.assigned_at || null,
due_at: data.due_at || null,
visibility: data.visibility || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await assignments.setClass_section( data.class_section || null, {
transaction,
});
await assignments.setSubject( data.subject || null, {
transaction,
});
await assignments.setTeacher( data.teacher || null, {
transaction,
});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.assignments.getTableName(),
belongsToColumn: 'attachments',
belongsToId: assignments.id,
},
data.attachments,
options,
);
return assignments;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const assignmentsData = data.map((item, index) => ({
id: item.id || undefined,
title: item.title || null,
instructions: item.instructions || null,
assigned_at: item.assigned_at || null,
due_at: item.due_at || null,
visibility: item.visibility || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const assignments = await db.assignments.bulkCreate(assignmentsData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < assignments.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.assignments.getTableName(),
belongsToColumn: 'attachments',
belongsToId: assignments[i].id,
},
data[i].attachments,
options,
);
}
return assignments;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const assignments = await db.assignments.findByPk(id, { transaction }); // findByPk takes (id, options); the extra argument was discarded
const updatePayload = {};
if (data.title !== undefined) updatePayload.title = data.title;
if (data.instructions !== undefined) updatePayload.instructions = data.instructions;
if (data.assigned_at !== undefined) updatePayload.assigned_at = data.assigned_at;
if (data.due_at !== undefined) updatePayload.due_at = data.due_at;
if (data.visibility !== undefined) updatePayload.visibility = data.visibility;
updatePayload.updatedById = currentUser.id;
await assignments.update(updatePayload, {transaction});
if (data.class_section !== undefined) {
await assignments.setClass_section(
data.class_section,
{ transaction }
);
}
if (data.subject !== undefined) {
await assignments.setSubject(
data.subject,
{ transaction }
);
}
if (data.teacher !== undefined) {
await assignments.setTeacher(
data.teacher,
{ transaction }
);
}
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.assignments.getTableName(),
belongsToColumn: 'attachments',
belongsToId: assignments.id,
},
data.attachments,
options,
);
return assignments;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const assignments = await db.assignments.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of assignments) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of assignments) {
await record.destroy({transaction});
}
});
return assignments;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const assignments = await db.assignments.findByPk(id, options);
await assignments.update({
deletedBy: currentUser.id
}, {
transaction,
});
await assignments.destroy({
transaction
});
return assignments;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const assignments = await db.assignments.findOne({ where, transaction }); // findOne takes a single options object
if (!assignments) {
return assignments;
}
const output = assignments.get({plain: true});
output.assignment_submissions_assignment = await assignments.getAssignment_submissions_assignment({
transaction
});
output.class_section = await assignments.getClass_section({
transaction
});
output.subject = await assignments.getSubject({
transaction
});
output.teacher = await assignments.getTeacher({
transaction
});
output.attachments = await assignments.getAttachments({
transaction
});
return output;
}
static async findAll(filter, options) {
const limit = filter.limit || 0;
const currentPage = +filter.page || 0; // guard: a missing/non-numeric page must not produce a NaN offset
const offset = currentPage * limit;
let where = {};
let include = [
{
model: db.class_sections,
as: 'class_section',
where: filter.class_section ? {
[Op.or]: [
{ id: { [Op.in]: filter.class_section.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.class_section.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.subjects,
as: 'subject',
where: filter.subject ? {
[Op.or]: [
{ id: { [Op.in]: filter.subject.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.subject.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.staff_members,
as: 'teacher',
where: filter.teacher ? {
[Op.or]: [
{ id: { [Op.in]: filter.teacher.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.teacher.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.file,
as: 'attachments',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
// Accumulate conditions in an array: repeated assignments to the same
// [Op.and] symbol key would otherwise overwrite earlier filters.
if (filter.title) {
where = { ...where, [Op.and]: [...(where[Op.and] || []), Utils.ilike('assignments', 'title', filter.title)] };
}
if (filter.instructions) {
where = { ...where, [Op.and]: [...(where[Op.and] || []), Utils.ilike('assignments', 'instructions', filter.instructions)] };
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
[Op.or]: [
{
assigned_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
{
due_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
],
};
}
if (filter.assigned_atRange) {
const [start, end] = filter.assigned_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
assigned_at: {
...where.assigned_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
assigned_at: {
...where.assigned_at,
[Op.lte]: end,
},
};
}
}
if (filter.due_atRange) {
const [start, end] = filter.due_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
due_at: {
...where.due_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
due_at: {
...where.due_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.visibility) {
where = {
...where,
visibility: filter.visibility,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.assignments.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'assignments',
'title',
query,
),
],
};
}
const records = await db.assignments.findAll({
attributes: [ 'id', 'title' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['title', 'ASC']], // Sequelize option is `order`; `orderBy` is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.title,
}));
}
};
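`findAllAutocomplete` above reshapes the fetched rows into `{ id, label }` pairs for dropdown widgets. A minimal sketch of that projection over plain objects (`toOptions` is a hypothetical name; the real code maps Sequelize model instances):

```javascript
// Same projection findAllAutocomplete applies to its rows: the primary key
// becomes the option value and the display column becomes the label.
function toOptions(records, labelField) {
  return records.map((record) => ({ id: record.id, label: record[labelField] }));
}

const options = toOptions(
  [{ id: 'a1', title: 'Homework 1' }, { id: 'a2', title: 'Essay' }],
  'title',
);
// -> [{ id: 'a1', label: 'Homework 1' }, { id: 'a2', label: 'Essay' }]
```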


@@ -0,0 +1,455 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Attendance_recordsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const attendance_records = await db.attendance_records.create(
{
id: data.id || undefined,
status: data.status || null,
remark: data.remark || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await attendance_records.setAttendance_session( data.attendance_session || null, {
transaction,
});
await attendance_records.setStudent( data.student || null, {
transaction,
});
return attendance_records;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const attendance_recordsData = data.map((item, index) => ({
id: item.id || undefined,
status: item.status || null,
remark: item.remark || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const attendance_records = await db.attendance_records.bulkCreate(attendance_recordsData, { transaction });
// For each item created, replace relation files
return attendance_records;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const attendance_records = await db.attendance_records.findByPk(id, { transaction }); // findByPk takes (id, options); the extra argument was discarded
const updatePayload = {};
if (data.status !== undefined) updatePayload.status = data.status;
if (data.remark !== undefined) updatePayload.remark = data.remark;
updatePayload.updatedById = currentUser.id;
await attendance_records.update(updatePayload, {transaction});
if (data.attendance_session !== undefined) {
await attendance_records.setAttendance_session(
data.attendance_session,
{ transaction }
);
}
if (data.student !== undefined) {
await attendance_records.setStudent(
data.student,
{ transaction }
);
}
return attendance_records;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const attendance_records = await db.attendance_records.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of attendance_records) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of attendance_records) {
await record.destroy({transaction});
}
});
return attendance_records;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const attendance_records = await db.attendance_records.findByPk(id, options);
await attendance_records.update({
deletedBy: currentUser.id
}, {
transaction,
});
await attendance_records.destroy({
transaction
});
return attendance_records;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const attendance_records = await db.attendance_records.findOne({ where, transaction }); // findOne takes a single options object
if (!attendance_records) {
return attendance_records;
}
const output = attendance_records.get({plain: true});
output.attendance_session = await attendance_records.getAttendance_session({
transaction
});
output.student = await attendance_records.getStudent({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
const currentPage = +filter.page || 0;
const offset = currentPage * limit;
let where = {};
let include = [
{
model: db.attendance_sessions,
as: 'attendance_session',
where: filter.attendance_session ? {
[Op.or]: [
{ id: { [Op.in]: filter.attendance_session.split('|').map(term => Utils.uuid(term)) } },
{
note: {
[Op.or]: filter.attendance_session.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.students,
as: 'student',
where: filter.student ? {
[Op.or]: [
{ id: { [Op.in]: filter.student.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.student.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.remark) {
where = {
...where,
[Op.and]: Utils.ilike(
'attendance_records',
'remark',
filter.remark,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.status) {
where = {
...where,
status: filter.status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.attendance_records.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'attendance_records',
'status',
query,
),
],
};
}
const records = await db.attendance_records.findAll({
attributes: [ 'id', 'status' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['status', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.status,
}));
}
};
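Every `findAll` in these DB APIs derives its SQL offset the same way. A minimal sketch of that pagination arithmetic (the helper name `pageToOffset` is ours, for illustration only):

```javascript
// Pagination arithmetic used by the findAll methods: page is 0-based and a
// limit of 0 means "no paging", so the offset stays at 0.
function pageToOffset(page, limit) {
  const currentPage = Number(page) || 0;
  return currentPage * (limit || 0);
}

// Page 2 with 25 rows per page skips the first 50 rows.
console.log(pageToOffset(2, 25));
```

Sequelize receives these values via `queryOptions.limit` and `queryOptions.offset`, with falsy values mapped to `undefined` so no LIMIT/OFFSET clause is emitted.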

const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Attendance_sessionsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const attendance_sessions = await db.attendance_sessions.create(
{
id: data.id || undefined,
session_at: data.session_at || null,
session_type: data.session_type || null,
note: data.note || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await attendance_sessions.setClass_section( data.class_section || null, {
transaction,
});
await attendance_sessions.setSubject( data.subject || null, {
transaction,
});
await attendance_sessions.setTeacher( data.teacher || null, {
transaction,
});
return attendance_sessions;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const attendance_sessionsData = data.map((item, index) => ({
id: item.id || undefined,
session_at: item.session_at || null,
session_type: item.session_type || null,
note: item.note || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const attendance_sessions = await db.attendance_sessions.bulkCreate(attendance_sessionsData, { transaction });
// For each item created, replace relation files
return attendance_sessions;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const attendance_sessions = await db.attendance_sessions.findByPk(id, { transaction });
const updatePayload = {};
if (data.session_at !== undefined) updatePayload.session_at = data.session_at;
if (data.session_type !== undefined) updatePayload.session_type = data.session_type;
if (data.note !== undefined) updatePayload.note = data.note;
updatePayload.updatedById = currentUser.id;
await attendance_sessions.update(updatePayload, {transaction});
if (data.class_section !== undefined) {
await attendance_sessions.setClass_section(
data.class_section,
{ transaction }
);
}
if (data.subject !== undefined) {
await attendance_sessions.setSubject(
data.subject,
{ transaction }
);
}
if (data.teacher !== undefined) {
await attendance_sessions.setTeacher(
data.teacher,
{ transaction }
);
}
return attendance_sessions;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
// Reuse the caller's transaction when one is provided; otherwise open a new
// one so the soft-delete marker and the destroy commit or roll back together.
const execute = async (transaction) => {
const attendance_sessions = await db.attendance_sessions.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
for (const record of attendance_sessions) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of attendance_sessions) {
await record.destroy({transaction});
}
return attendance_sessions;
};
if (options && options.transaction) {
return execute(options.transaction);
}
return db.sequelize.transaction(execute);
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const attendance_sessions = await db.attendance_sessions.findByPk(id, options);
await attendance_sessions.update({
deletedBy: currentUser.id
}, {
transaction,
});
await attendance_sessions.destroy({
transaction
});
return attendance_sessions;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const attendance_sessions = await db.attendance_sessions.findOne({
where,
transaction,
});
if (!attendance_sessions) {
return attendance_sessions;
}
const output = attendance_sessions.get({plain: true});
output.attendance_records_attendance_session = await attendance_sessions.getAttendance_records_attendance_session({
transaction
});
output.class_section = await attendance_sessions.getClass_section({
transaction
});
output.subject = await attendance_sessions.getSubject({
transaction
});
output.teacher = await attendance_sessions.getTeacher({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
const currentPage = +filter.page || 0;
const offset = currentPage * limit;
let where = {};
let include = [
{
model: db.class_sections,
as: 'class_section',
where: filter.class_section ? {
[Op.or]: [
{ id: { [Op.in]: filter.class_section.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.class_section.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.subjects,
as: 'subject',
where: filter.subject ? {
[Op.or]: [
{ id: { [Op.in]: filter.subject.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.subject.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.staff_members,
as: 'teacher',
where: filter.teacher ? {
[Op.or]: [
{ id: { [Op.in]: filter.teacher.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.teacher.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.note) {
where = {
...where,
[Op.and]: Utils.ilike(
'attendance_sessions',
'note',
filter.note,
),
};
}
if (filter.session_atRange) {
const [start, end] = filter.session_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
session_at: {
...where.session_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
session_at: {
...where.session_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.session_type) {
where = {
...where,
session_type: filter.session_type,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.attendance_sessions.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'attendance_sessions',
'note',
query,
),
],
};
}
const records = await db.attendance_sessions.findAll({
attributes: [ 'id', 'note' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['note', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.note,
}));
}
};
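The relation filters in `findAll` accept pipe-separated search terms (e.g. `?teacher=smith|jones`) and turn each term into a case-insensitive LIKE pattern for the `Op.iLike` clauses. A self-contained sketch of that transformation (the helper name is ours):

```javascript
// Turn a pipe-separated filter string into the %term% patterns that feed
// the Op.iLike clauses inside the include filters.
function termPatterns(raw) {
  return raw.split('|').map((term) => `%${term}%`);
}

// Two terms become two patterns, OR-ed together by the caller.
console.log(termPatterns('smith|jones'));
```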

const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Class_schedulesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const class_schedules = await db.class_schedules.create(
{
id: data.id || undefined,
weekday: data.weekday || null,
period_label: data.period_label || null,
starts_at: data.starts_at || null,
ends_at: data.ends_at || null,
room: data.room || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await class_schedules.setClass_section( data.class_section || null, {
transaction,
});
await class_schedules.setSubject( data.subject || null, {
transaction,
});
await class_schedules.setTeacher( data.teacher || null, {
transaction,
});
return class_schedules;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const class_schedulesData = data.map((item, index) => ({
id: item.id || undefined,
weekday: item.weekday || null,
period_label: item.period_label || null,
starts_at: item.starts_at || null,
ends_at: item.ends_at || null,
room: item.room || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const class_schedules = await db.class_schedules.bulkCreate(class_schedulesData, { transaction });
// For each item created, replace relation files
return class_schedules;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const class_schedules = await db.class_schedules.findByPk(id, { transaction });
const updatePayload = {};
if (data.weekday !== undefined) updatePayload.weekday = data.weekday;
if (data.period_label !== undefined) updatePayload.period_label = data.period_label;
if (data.starts_at !== undefined) updatePayload.starts_at = data.starts_at;
if (data.ends_at !== undefined) updatePayload.ends_at = data.ends_at;
if (data.room !== undefined) updatePayload.room = data.room;
updatePayload.updatedById = currentUser.id;
await class_schedules.update(updatePayload, {transaction});
if (data.class_section !== undefined) {
await class_schedules.setClass_section(
data.class_section,
{ transaction }
);
}
if (data.subject !== undefined) {
await class_schedules.setSubject(
data.subject,
{ transaction }
);
}
if (data.teacher !== undefined) {
await class_schedules.setTeacher(
data.teacher,
{ transaction }
);
}
return class_schedules;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
// Reuse the caller's transaction when one is provided; otherwise open a new
// one so the soft-delete marker and the destroy commit or roll back together.
const execute = async (transaction) => {
const class_schedules = await db.class_schedules.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
for (const record of class_schedules) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of class_schedules) {
await record.destroy({transaction});
}
return class_schedules;
};
if (options && options.transaction) {
return execute(options.transaction);
}
return db.sequelize.transaction(execute);
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const class_schedules = await db.class_schedules.findByPk(id, options);
await class_schedules.update({
deletedBy: currentUser.id
}, {
transaction,
});
await class_schedules.destroy({
transaction
});
return class_schedules;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const class_schedules = await db.class_schedules.findOne({
where,
transaction,
});
if (!class_schedules) {
return class_schedules;
}
const output = class_schedules.get({plain: true});
output.class_section = await class_schedules.getClass_section({
transaction
});
output.subject = await class_schedules.getSubject({
transaction
});
output.teacher = await class_schedules.getTeacher({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
const currentPage = +filter.page || 0;
const offset = currentPage * limit;
let where = {};
let include = [
{
model: db.class_sections,
as: 'class_section',
where: filter.class_section ? {
[Op.or]: [
{ id: { [Op.in]: filter.class_section.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.class_section.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.subjects,
as: 'subject',
where: filter.subject ? {
[Op.or]: [
{ id: { [Op.in]: filter.subject.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.subject.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.staff_members,
as: 'teacher',
where: filter.teacher ? {
[Op.or]: [
{ id: { [Op.in]: filter.teacher.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.teacher.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.period_label) {
where = {
...where,
[Op.and]: Utils.ilike(
'class_schedules',
'period_label',
filter.period_label,
),
};
}
if (filter.room) {
where = {
...where,
[Op.and]: Utils.ilike(
'class_schedules',
'room',
filter.room,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
[Op.or]: [
{
starts_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
{
ends_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
],
};
}
if (filter.starts_atRange) {
const [start, end] = filter.starts_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
starts_at: {
...where.starts_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
starts_at: {
...where.starts_at,
[Op.lte]: end,
},
};
}
}
if (filter.ends_atRange) {
const [start, end] = filter.ends_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
ends_at: {
...where.ends_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
ends_at: {
...where.ends_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.weekday) {
where = {
...where,
weekday: filter.weekday,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.class_schedules.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'class_schedules',
'period_label',
query,
),
],
};
}
const records = await db.class_schedules.findAll({
attributes: [ 'id', 'period_label' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['period_label', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.period_label,
}));
}
};
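The `*Range` filters in these `findAll` methods all merge a lower and an upper bound onto the same field without clobbering each other. The pattern can be sketched as a pure helper (`applyRange` and the string stand-ins for the Sequelize operators are ours, for illustration):

```javascript
// Merge [start, end] bounds onto where[field]; empty-ish bounds are skipped,
// and an existing bound on the same field is preserved via the spread.
function applyRange(where, field, range, ops) {
  const [start, end] = range;
  let next = { ...where };
  if (start !== undefined && start !== null && start !== '') {
    next = { ...next, [field]: { ...next[field], [ops.gte]: start } };
  }
  if (end !== undefined && end !== null && end !== '') {
    next = { ...next, [field]: { ...next[field], [ops.lte]: end } };
  }
  return next;
}

// With Sequelize, ops would be { gte: Op.gte, lte: Op.lte }.
```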

const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Class_sectionsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const class_sections = await db.class_sections.create(
{
id: data.id || undefined,
name: data.name || null,
capacity: data.capacity || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await class_sections.setSchool_year( data.school_year || null, {
transaction,
});
await class_sections.setGrade( data.grade || null, {
transaction,
});
await class_sections.setStream( data.stream || null, {
transaction,
});
await class_sections.setHomeroom_teacher( data.homeroom_teacher || null, {
transaction,
});
return class_sections;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const class_sectionsData = data.map((item, index) => ({
id: item.id || undefined,
name: item.name || null,
capacity: item.capacity || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const class_sections = await db.class_sections.bulkCreate(class_sectionsData, { transaction });
// For each item created, replace relation files
return class_sections;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const class_sections = await db.class_sections.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
if (data.capacity !== undefined) updatePayload.capacity = data.capacity;
updatePayload.updatedById = currentUser.id;
await class_sections.update(updatePayload, {transaction});
if (data.school_year !== undefined) {
await class_sections.setSchool_year(
data.school_year,
{ transaction }
);
}
if (data.grade !== undefined) {
await class_sections.setGrade(
data.grade,
{ transaction }
);
}
if (data.stream !== undefined) {
await class_sections.setStream(
data.stream,
{ transaction }
);
}
if (data.homeroom_teacher !== undefined) {
await class_sections.setHomeroom_teacher(
data.homeroom_teacher,
{ transaction }
);
}
return class_sections;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
// Reuse the caller's transaction when one is provided; otherwise open a new
// one so the soft-delete marker and the destroy commit or roll back together.
const execute = async (transaction) => {
const class_sections = await db.class_sections.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
for (const record of class_sections) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of class_sections) {
await record.destroy({transaction});
}
return class_sections;
};
if (options && options.transaction) {
return execute(options.transaction);
}
return db.sequelize.transaction(execute);
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const class_sections = await db.class_sections.findByPk(id, options);
await class_sections.update({
deletedBy: currentUser.id
}, {
transaction,
});
await class_sections.destroy({
transaction
});
return class_sections;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const class_sections = await db.class_sections.findOne({
where,
transaction,
});
if (!class_sections) {
return class_sections;
}
const output = class_sections.get({plain: true});
output.enrollments_class_section = await class_sections.getEnrollments_class_section({
transaction
});
output.attendance_sessions_class_section = await class_sections.getAttendance_sessions_class_section({
transaction
});
output.class_schedules_class_section = await class_sections.getClass_schedules_class_section({
transaction
});
output.assignments_class_section = await class_sections.getAssignments_class_section({
transaction
});
output.lesson_plans_class_section = await class_sections.getLesson_plans_class_section({
transaction
});
output.school_year = await class_sections.getSchool_year({
transaction
});
output.grade = await class_sections.getGrade({
transaction
});
output.stream = await class_sections.getStream({
transaction
});
output.homeroom_teacher = await class_sections.getHomeroom_teacher({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
const currentPage = +filter.page || 0;
const offset = currentPage * limit;
let where = {};
let include = [
{
model: db.school_years,
as: 'school_year',
where: filter.school_year ? {
[Op.or]: [
{ id: { [Op.in]: filter.school_year.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.school_year.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.grades,
as: 'grade',
where: filter.grade ? {
[Op.or]: [
{ id: { [Op.in]: filter.grade.split('|').map(term => Utils.uuid(term)) } },
{
label: {
[Op.or]: filter.grade.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.streams,
as: 'stream',
where: filter.stream ? {
[Op.or]: [
{ id: { [Op.in]: filter.stream.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.stream.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.staff_members,
as: 'homeroom_teacher',
where: filter.homeroom_teacher ? {
[Op.or]: [
{ id: { [Op.in]: filter.homeroom_teacher.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.homeroom_teacher.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'class_sections',
'name',
filter.name,
),
};
}
if (filter.capacityRange) {
const [start, end] = filter.capacityRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
capacity: {
...where.capacity,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
capacity: {
...where.capacity,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.class_sections.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'class_sections',
'name',
query,
),
],
};
}
const records = await db.class_sections.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
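Each `findAllAutocomplete` projects its rows down to `{ id, label }` pairs for a select widget; only the label column differs per entity. A sketch of that mapping (the generic helper is ours; the real methods hard-code the label column):

```javascript
// Project raw rows to the { id, label } shape the autocomplete endpoints return.
function toOptions(records, labelField) {
  return records.map((record) => ({ id: record.id, label: record[labelField] }));
}

// class_sections uses 'name' as its label column.
console.log(toOptions([{ id: 'a1', name: '7A' }], 'name'));
```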

const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Contact_messagesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const contact_messages = await db.contact_messages.create(
{
id: data.id || undefined,
full_name: data.full_name || null,
phone_number: data.phone_number || null,
email: data.email || null,
topic: data.topic || null,
message: data.message || null,
submitted_at: data.submitted_at || null,
status: data.status || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
return contact_messages;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const contact_messagesData = data.map((item, index) => ({
id: item.id || undefined,
full_name: item.full_name || null,
phone_number: item.phone_number || null,
email: item.email || null,
topic: item.topic || null,
message: item.message || null,
submitted_at: item.submitted_at || null,
status: item.status || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const contact_messages = await db.contact_messages.bulkCreate(contact_messagesData, { transaction });
// For each item created, replace relation files
return contact_messages;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const contact_messages = await db.contact_messages.findByPk(id, { transaction });
const updatePayload = {};
if (data.full_name !== undefined) updatePayload.full_name = data.full_name;
if (data.phone_number !== undefined) updatePayload.phone_number = data.phone_number;
if (data.email !== undefined) updatePayload.email = data.email;
if (data.topic !== undefined) updatePayload.topic = data.topic;
if (data.message !== undefined) updatePayload.message = data.message;
if (data.submitted_at !== undefined) updatePayload.submitted_at = data.submitted_at;
if (data.status !== undefined) updatePayload.status = data.status;
updatePayload.updatedById = currentUser.id;
await contact_messages.update(updatePayload, {transaction});
return contact_messages;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
// Reuse the caller's transaction when one is provided; otherwise open a new
// one so the soft-delete marker and the destroy commit or roll back together.
const execute = async (transaction) => {
const contact_messages = await db.contact_messages.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
for (const record of contact_messages) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of contact_messages) {
await record.destroy({transaction});
}
return contact_messages;
};
if (options && options.transaction) {
return execute(options.transaction);
}
return db.sequelize.transaction(execute);
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const contact_messages = await db.contact_messages.findByPk(id, options);
await contact_messages.update({
deletedBy: currentUser.id
}, {
transaction,
});
await contact_messages.destroy({
transaction
});
return contact_messages;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const contact_messages = await db.contact_messages.findOne({
where,
transaction,
});
if (!contact_messages) {
return contact_messages;
}
const output = contact_messages.get({plain: true});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
const currentPage = +filter.page || 0;
const offset = currentPage * limit;
let where = {};
let include = [
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.full_name) {
where = {
...where,
[Op.and]: Utils.ilike(
'contact_messages',
'full_name',
filter.full_name,
),
};
}
if (filter.phone_number) {
where = {
...where,
[Op.and]: Utils.ilike(
'contact_messages',
'phone_number',
filter.phone_number,
),
};
}
if (filter.email) {
where = {
...where,
[Op.and]: Utils.ilike(
'contact_messages',
'email',
filter.email,
),
};
}
if (filter.message) {
where = {
...where,
[Op.and]: Utils.ilike(
'contact_messages',
'message',
filter.message,
),
};
}
if (filter.submitted_atRange) {
const [start, end] = filter.submitted_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
submitted_at: {
...where.submitted_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
submitted_at: {
...where.submitted_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.topic) {
where = {
...where,
topic: filter.topic,
};
}
if (filter.status) {
where = {
...where,
status: filter.status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.contact_messages.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'contact_messages',
'full_name',
query,
),
],
};
}
const records = await db.contact_messages.findAll({
attributes: [ 'id', 'full_name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['full_name', 'ASC']], // Sequelize expects `order`; `orderBy` is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.full_name,
}));
}
};
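The range filters in `findAll` above build the `where` clause incrementally, merging `Op.gte` and `Op.lte` under the same key so a two-sided range ends up as a single condition on that column. A minimal standalone sketch of that merge (plain symbols stand in for Sequelize's `Op`; the helper name is illustrative):

```javascript
// Stand-ins for Sequelize operators so the sketch runs without the library
const Op = { gte: Symbol('gte'), lte: Symbol('lte') };

// Mirrors the filter.createdAtRange / submitted_atRange handling in findAll
function applyRange(where, field, [start, end]) {
  if (start !== undefined && start !== null && start !== '') {
    where = { ...where, [field]: { ...where[field], [Op.gte]: start } };
  }
  if (end !== undefined && end !== null && end !== '') {
    where = { ...where, [field]: { ...where[field], [Op.lte]: end } };
  }
  return where;
}

let where = {};
where = applyRange(where, 'createdAt', ['2024-01-01', '2024-12-31']);
// where.createdAt now carries both bounds in one object
```

Because each bound spreads the previous value of `where[field]`, a one-sided range produces only the bound that was supplied.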


@ -0,0 +1,468 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class EnrollmentsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const enrollments = await db.enrollments.create(
{
id: data.id || undefined,
enrolled_at: data.enrolled_at || null,
status: data.status || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await enrollments.setStudent( data.student || null, {
transaction,
});
await enrollments.setClass_section( data.class_section || null, {
transaction,
});
return enrollments;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const enrollmentsData = data.map((item, index) => ({
id: item.id || undefined,
enrolled_at: item.enrolled_at || null,
status: item.status || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const enrollments = await db.enrollments.bulkCreate(enrollmentsData, { transaction });
// For each item created, replace relation files
return enrollments;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
// findByPk takes (id, options); the transaction belongs inside the options object
const enrollments = await db.enrollments.findByPk(id, { transaction });
const updatePayload = {};
if (data.enrolled_at !== undefined) updatePayload.enrolled_at = data.enrolled_at;
if (data.status !== undefined) updatePayload.status = data.status;
updatePayload.updatedById = currentUser.id;
await enrollments.update(updatePayload, {transaction});
if (data.student !== undefined) {
await enrollments.setStudent(
data.student,
{ transaction }
);
}
if (data.class_section !== undefined) {
await enrollments.setClass_section(
data.class_section,
{ transaction }
);
}
return enrollments;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const enrollments = await db.enrollments.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of enrollments) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of enrollments) {
await record.destroy({transaction});
}
});
return enrollments;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const enrollments = await db.enrollments.findByPk(id, options);
await enrollments.update({
deletedBy: currentUser.id
}, {
transaction,
});
await enrollments.destroy({
transaction
});
return enrollments;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
// findOne takes a single options object; pass the transaction alongside `where`
const enrollments = await db.enrollments.findOne(
{ where, transaction },
);
if (!enrollments) {
return enrollments;
}
const output = enrollments.get({plain: true});
output.student = await enrollments.getStudent({
transaction
});
output.class_section = await enrollments.getClass_section({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let where = {};
// Guard against a missing page: +undefined is NaN, which would poison offset
const currentPage = +filter.page || 0;
const offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.students,
as: 'student',
where: filter.student ? {
[Op.or]: [
{ id: { [Op.in]: filter.student.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.student.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.class_sections,
as: 'class_section',
where: filter.class_section ? {
[Op.or]: [
{ id: { [Op.in]: filter.class_section.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.class_section.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.enrolled_atRange) {
const [start, end] = filter.enrolled_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
enrolled_at: {
...where.enrolled_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
enrolled_at: {
...where.enrolled_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.status) {
where = {
...where,
status: filter.status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.enrollments.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'enrollments',
'status',
query,
),
],
};
}
const records = await db.enrollments.findAll({
attributes: [ 'id', 'status' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['status', 'ASC']], // Sequelize expects `order`; `orderBy` is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.status,
}));
}
};
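The pagination in `findAll` derives `offset` from a zero-based `page` and a `limit`, and leaves both undefined when no limit is requested so that Sequelize returns all rows. A standalone sketch of that arithmetic (helper name is illustrative):

```javascript
// Mirrors the limit/offset handling in findAll: page is zero-based,
// and limit 0 means "no pagination" (limit and offset both undefined)
function pageToQueryOptions(filter) {
  const limit = filter.limit || 0;
  const currentPage = +filter.page || 0; // NaN-safe: missing page -> 0
  const offset = currentPage * limit;
  return {
    limit: limit ? Number(limit) : undefined,
    offset: offset ? Number(offset) : undefined,
  };
}
```

Note that page 0 yields an `offset` of 0, which the truthiness check turns into `undefined`; Sequelize treats that the same as offset 0.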


@ -0,0 +1,681 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class EventsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const events = await db.events.create(
{
id: data.id || undefined,
title_om: data.title_om || null,
title_am: data.title_am || null,
title_en: data.title_en || null,
description_om: data.description_om || null,
description_am: data.description_am || null,
description_en: data.description_en || null,
starts_at: data.starts_at || null,
ends_at: data.ends_at || null,
location_text: data.location_text || null,
visibility: data.visibility || null,
published: data.published || false,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.events.getTableName(),
belongsToColumn: 'cover_image',
belongsToId: events.id,
},
data.cover_image,
options,
);
return events;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const eventsData = data.map((item, index) => ({
id: item.id || undefined,
title_om: item.title_om || null,
title_am: item.title_am || null,
title_en: item.title_en || null,
description_om: item.description_om || null,
description_am: item.description_am || null,
description_en: item.description_en || null,
starts_at: item.starts_at || null,
ends_at: item.ends_at || null,
location_text: item.location_text || null,
visibility: item.visibility || null,
published: item.published || false,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const events = await db.events.bulkCreate(eventsData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < events.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.events.getTableName(),
belongsToColumn: 'cover_image',
belongsToId: events[i].id,
},
data[i].cover_image,
options,
);
}
return events;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
// findByPk takes (id, options); the transaction belongs inside the options object
const events = await db.events.findByPk(id, { transaction });
const updatePayload = {};
if (data.title_om !== undefined) updatePayload.title_om = data.title_om;
if (data.title_am !== undefined) updatePayload.title_am = data.title_am;
if (data.title_en !== undefined) updatePayload.title_en = data.title_en;
if (data.description_om !== undefined) updatePayload.description_om = data.description_om;
if (data.description_am !== undefined) updatePayload.description_am = data.description_am;
if (data.description_en !== undefined) updatePayload.description_en = data.description_en;
if (data.starts_at !== undefined) updatePayload.starts_at = data.starts_at;
if (data.ends_at !== undefined) updatePayload.ends_at = data.ends_at;
if (data.location_text !== undefined) updatePayload.location_text = data.location_text;
if (data.visibility !== undefined) updatePayload.visibility = data.visibility;
if (data.published !== undefined) updatePayload.published = data.published;
updatePayload.updatedById = currentUser.id;
await events.update(updatePayload, {transaction});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.events.getTableName(),
belongsToColumn: 'cover_image',
belongsToId: events.id,
},
data.cover_image,
options,
);
return events;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const events = await db.events.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of events) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of events) {
await record.destroy({transaction});
}
});
return events;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const events = await db.events.findByPk(id, options);
await events.update({
deletedBy: currentUser.id
}, {
transaction,
});
await events.destroy({
transaction
});
return events;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
// findOne takes a single options object; pass the transaction alongside `where`
const events = await db.events.findOne(
{ where, transaction },
);
if (!events) {
return events;
}
const output = events.get({plain: true});
output.cover_image = await events.getCover_image({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let where = {};
// Guard against a missing page: +undefined is NaN, which would poison offset
const currentPage = +filter.page || 0;
const offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.file,
as: 'cover_image',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.title_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'events',
'title_om',
filter.title_om,
),
};
}
if (filter.title_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'events',
'title_am',
filter.title_am,
),
};
}
if (filter.title_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'events',
'title_en',
filter.title_en,
),
};
}
if (filter.description_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'events',
'description_om',
filter.description_om,
),
};
}
if (filter.description_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'events',
'description_am',
filter.description_am,
),
};
}
if (filter.description_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'events',
'description_en',
filter.description_en,
),
};
}
if (filter.location_text) {
where = {
...where,
[Op.and]: Utils.ilike(
'events',
'location_text',
filter.location_text,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
[Op.or]: [
{
starts_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
{
ends_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
],
};
}
if (filter.starts_atRange) {
const [start, end] = filter.starts_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
starts_at: {
...where.starts_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
starts_at: {
...where.starts_at,
[Op.lte]: end,
},
};
}
}
if (filter.ends_atRange) {
const [start, end] = filter.ends_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
ends_at: {
...where.ends_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
ends_at: {
...where.ends_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.visibility) {
where = {
...where,
visibility: filter.visibility,
};
}
if (filter.published) {
where = {
...where,
published: filter.published,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.events.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'events',
'title_en',
query,
),
],
};
}
const records = await db.events.findAll({
attributes: [ 'id', 'title_en' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['title_en', 'ASC']], // Sequelize expects `order`; `orderBy` is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.title_en,
}));
}
};
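The `calendarStart`/`calendarEnd` branch in the events `findAll` matches events whose start or end falls inside the visible calendar window. A standalone sketch of that clause (plain symbols stand in for Sequelize's `Op`; the function name is illustrative):

```javascript
// Stand-ins for Sequelize operators so the sketch runs without the library
const Op = { or: Symbol('or'), between: Symbol('between') };

// Mirrors the calendarStart/calendarEnd branch above: an event matches
// when it starts OR ends inside the [calendarStart, calendarEnd] window
function calendarWindowWhere(calendarStart, calendarEnd) {
  return {
    [Op.or]: [
      { starts_at: { [Op.between]: [calendarStart, calendarEnd] } },
      { ends_at: { [Op.between]: [calendarStart, calendarEnd] } },
    ],
  };
}

const w = calendarWindowWhere('2024-05-01', '2024-05-31');
```

As written, an event that spans the entire window (starts before it, ends after it) is not matched; covering that case would need a third clause combining `starts_at <= calendarStart` with `ends_at >= calendarEnd`.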


@ -0,0 +1,581 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Exam_performance_summariesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const exam_performance_summaries = await db.exam_performance_summaries.create(
{
id: data.id || undefined,
average_score: data.average_score || null,
highest_score: data.highest_score || null,
lowest_score: data.lowest_score || null,
pass_rate: data.pass_rate || null,
student_count: data.student_count || null,
published: data.published || false,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await exam_performance_summaries.setExam( data.exam || null, {
transaction,
});
return exam_performance_summaries;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const exam_performance_summariesData = data.map((item, index) => ({
id: item.id || undefined,
average_score: item.average_score || null,
highest_score: item.highest_score || null,
lowest_score: item.lowest_score || null,
pass_rate: item.pass_rate || null,
student_count: item.student_count || null,
published: item.published || false,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const exam_performance_summaries = await db.exam_performance_summaries.bulkCreate(exam_performance_summariesData, { transaction });
// For each item created, replace relation files
return exam_performance_summaries;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
// findByPk takes (id, options); the transaction belongs inside the options object
const exam_performance_summaries = await db.exam_performance_summaries.findByPk(id, { transaction });
const updatePayload = {};
if (data.average_score !== undefined) updatePayload.average_score = data.average_score;
if (data.highest_score !== undefined) updatePayload.highest_score = data.highest_score;
if (data.lowest_score !== undefined) updatePayload.lowest_score = data.lowest_score;
if (data.pass_rate !== undefined) updatePayload.pass_rate = data.pass_rate;
if (data.student_count !== undefined) updatePayload.student_count = data.student_count;
if (data.published !== undefined) updatePayload.published = data.published;
updatePayload.updatedById = currentUser.id;
await exam_performance_summaries.update(updatePayload, {transaction});
if (data.exam !== undefined) {
await exam_performance_summaries.setExam(
data.exam,
{ transaction }
);
}
return exam_performance_summaries;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const exam_performance_summaries = await db.exam_performance_summaries.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of exam_performance_summaries) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of exam_performance_summaries) {
await record.destroy({transaction});
}
});
return exam_performance_summaries;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const exam_performance_summaries = await db.exam_performance_summaries.findByPk(id, options);
await exam_performance_summaries.update({
deletedBy: currentUser.id
}, {
transaction,
});
await exam_performance_summaries.destroy({
transaction
});
return exam_performance_summaries;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
// findOne takes a single options object; pass the transaction alongside `where`
const exam_performance_summaries = await db.exam_performance_summaries.findOne(
{ where, transaction },
);
if (!exam_performance_summaries) {
return exam_performance_summaries;
}
const output = exam_performance_summaries.get({plain: true});
output.exam = await exam_performance_summaries.getExam({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let where = {};
// Guard against a missing page: +undefined is NaN, which would poison offset
const currentPage = +filter.page || 0;
const offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.exams,
as: 'exam',
where: filter.exam ? {
[Op.or]: [
{ id: { [Op.in]: filter.exam.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.exam.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.average_scoreRange) {
const [start, end] = filter.average_scoreRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
average_score: {
...where.average_score,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
average_score: {
...where.average_score,
[Op.lte]: end,
},
};
}
}
if (filter.highest_scoreRange) {
const [start, end] = filter.highest_scoreRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
highest_score: {
...where.highest_score,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
highest_score: {
...where.highest_score,
[Op.lte]: end,
},
};
}
}
if (filter.lowest_scoreRange) {
const [start, end] = filter.lowest_scoreRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
lowest_score: {
...where.lowest_score,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
lowest_score: {
...where.lowest_score,
[Op.lte]: end,
},
};
}
}
if (filter.pass_rateRange) {
const [start, end] = filter.pass_rateRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
pass_rate: {
...where.pass_rate,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
pass_rate: {
...where.pass_rate,
[Op.lte]: end,
},
};
}
}
if (filter.student_countRange) {
const [start, end] = filter.student_countRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
student_count: {
...where.student_count,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
student_count: {
...where.student_count,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.published) {
where = {
...where,
published: filter.published,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.exam_performance_summaries.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'exam_performance_summaries',
'published',
query,
),
],
};
}
const records = await db.exam_performance_summaries.findAll({
attributes: [ 'id', 'published' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['published', 'ASC']], // Sequelize expects `order`; `orderBy` is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.published,
}));
}
};
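Every `findAllAutocomplete` above ends the same way: projecting matched records down to `{ id, label }` pairs for a typeahead widget, with the label drawn from a model-specific display column. A standalone sketch of that projection (the helper name and sample data are illustrative):

```javascript
// Mirrors the final step of findAllAutocomplete: map records to
// { id, label } options, with the label column chosen per model
function toAutocompleteOptions(records, labelField) {
  return records.map((record) => ({
    id: record.id,
    label: record[labelField],
  }));
}

const out = toAutocompleteOptions(
  [{ id: '1', full_name: 'Abebe' }, { id: '2', full_name: 'Chaltu' }],
  'full_name',
);
```

For this model the label column is the boolean `published`, which yields `true`/`false` labels; a text column usually makes a more readable choice.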


@ -0,0 +1,616 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Exam_resultsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const exam_results = await db.exam_results.create(
{
id: data.id || undefined,
score: data.score || null,
out_of: data.out_of || null,
result_status: data.result_status || null,
entered_at: data.entered_at || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await exam_results.setExam( data.exam || null, {
transaction,
});
await exam_results.setStudent( data.student || null, {
transaction,
});
await exam_results.setSubject( data.subject || null, {
transaction,
});
await exam_results.setEntered_by( data.entered_by || null, {
transaction,
});
return exam_results;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const exam_resultsData = data.map((item, index) => ({
id: item.id || undefined,
score: item.score || null,
out_of: item.out_of || null,
result_status: item.result_status || null,
entered_at: item.entered_at || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const exam_results = await db.exam_results.bulkCreate(exam_resultsData, { transaction });
// For each item created, replace relation files
return exam_results;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
// findByPk takes (id, options); the transaction belongs inside the options object
const exam_results = await db.exam_results.findByPk(id, { transaction });
const updatePayload = {};
if (data.score !== undefined) updatePayload.score = data.score;
if (data.out_of !== undefined) updatePayload.out_of = data.out_of;
if (data.result_status !== undefined) updatePayload.result_status = data.result_status;
if (data.entered_at !== undefined) updatePayload.entered_at = data.entered_at;
updatePayload.updatedById = currentUser.id;
await exam_results.update(updatePayload, {transaction});
if (data.exam !== undefined) {
await exam_results.setExam(
data.exam,
{ transaction }
);
}
if (data.student !== undefined) {
await exam_results.setStudent(
data.student,
{ transaction }
);
}
if (data.subject !== undefined) {
await exam_results.setSubject(
data.subject,
{ transaction }
);
}
if (data.entered_by !== undefined) {
await exam_results.setEntered_by(
data.entered_by,
{ transaction }
);
}
return exam_results;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const exam_results = await db.exam_results.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of exam_results) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of exam_results) {
await record.destroy({transaction});
}
});
return exam_results;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const exam_results = await db.exam_results.findByPk(id, options);
await exam_results.update({
deletedBy: currentUser.id
}, {
transaction,
});
await exam_results.destroy({
transaction
});
return exam_results;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
// findOne takes a single options object; pass the transaction alongside `where`
const exam_results = await db.exam_results.findOne(
{ where, transaction },
);
if (!exam_results) {
return exam_results;
}
const output = exam_results.get({plain: true});
output.exam = await exam_results.getExam({
transaction
});
output.student = await exam_results.getStudent({
transaction
});
output.subject = await exam_results.getSubject({
transaction
});
output.entered_by = await exam_results.getEntered_by({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let where = {};
// Guard against a missing page: +undefined is NaN, which would poison offset
const currentPage = +filter.page || 0;
const offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.exams,
as: 'exam',
where: filter.exam ? {
[Op.or]: [
{ id: { [Op.in]: filter.exam.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.exam.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.students,
as: 'student',
where: filter.student ? {
[Op.or]: [
{ id: { [Op.in]: filter.student.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.student.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.subjects,
as: 'subject',
where: filter.subject ? {
[Op.or]: [
{ id: { [Op.in]: filter.subject.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.subject.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.staff_members,
as: 'entered_by',
where: filter.entered_by ? {
[Op.or]: [
{ id: { [Op.in]: filter.entered_by.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.entered_by.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.scoreRange) {
const [start, end] = filter.scoreRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
score: {
...where.score,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
score: {
...where.score,
[Op.lte]: end,
},
};
}
}
if (filter.out_ofRange) {
const [start, end] = filter.out_ofRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
out_of: {
...where.out_of,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
out_of: {
...where.out_of,
[Op.lte]: end,
},
};
}
}
if (filter.entered_atRange) {
const [start, end] = filter.entered_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
entered_at: {
...where.entered_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
entered_at: {
...where.entered_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.result_status) {
where = {
...where,
result_status: filter.result_status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.exam_results.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'exam_results',
'result_status',
query,
),
],
};
}
const records = await db.exam_results.findAll({
attributes: [ 'id', 'result_status' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['result_status', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.result_status,
}));
}
};
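The `findAll` method above repeats the same range-filter pattern for `score`, `out_of`, `entered_at`, and `createdAt`. A minimal standalone sketch of that merge logic, using plain `Symbol` stand-ins for `Sequelize.Op` (an assumption for this sketch; the real code uses the operators imported at the top of each file, and `applyRangeFilter` is a hypothetical helper, not part of the source):

```javascript
// Stand-ins for Sequelize's Op.gte / Op.lte symbols (assumption for this sketch).
const Op = { gte: Symbol.for('gte'), lte: Symbol.for('lte') };

// Hypothetical helper mirroring the repeated "<field>Range" blocks in findAll:
// each bound is merged into where[field] only when it is actually present.
function applyRangeFilter(where, field, range) {
  if (!range) return where;
  const [start, end] = range;
  if (start !== undefined && start !== null && start !== '') {
    where = { ...where, [field]: { ...where[field], [Op.gte]: start } };
  }
  if (end !== undefined && end !== null && end !== '') {
    where = { ...where, [field]: { ...where[field], [Op.lte]: end } };
  }
  return where;
}
```

Spreading `...where[field]` lets a later bound extend, rather than overwrite, an earlier bound on the same column, which is why a `[start, end]` pair ends up as one object with both operators.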

655
backend/src/db/api/exams.js Normal file

@ -0,0 +1,655 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class ExamsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
    const exams = await db.exams.create(
      {
        id: data.id || undefined,
        name: data.name || null,
        exam_type: data.exam_type || null,
        starts_at: data.starts_at || null,
        ends_at: data.ends_at || null,
        public_results: data.public_results || false,
        importHash: data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );
    await exams.setSchool_year(data.school_year || null, { transaction });
    await exams.setTerm(data.term || null, { transaction });
    await exams.setGrade(data.grade || null, { transaction });
    await exams.setStream(data.stream || null, { transaction });
return exams;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
    const examsData = data.map((item, index) => ({
      id: item.id || undefined,
      name: item.name || null,
      exam_type: item.exam_type || null,
      starts_at: item.starts_at || null,
      ends_at: item.ends_at || null,
      public_results: item.public_results || false,
      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));
// Bulk create items
const exams = await db.exams.bulkCreate(examsData, { transaction });
// For each item created, replace relation files
return exams;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const exams = await db.exams.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
if (data.exam_type !== undefined) updatePayload.exam_type = data.exam_type;
if (data.starts_at !== undefined) updatePayload.starts_at = data.starts_at;
if (data.ends_at !== undefined) updatePayload.ends_at = data.ends_at;
if (data.public_results !== undefined) updatePayload.public_results = data.public_results;
updatePayload.updatedById = currentUser.id;
await exams.update(updatePayload, {transaction});
if (data.school_year !== undefined) {
await exams.setSchool_year(
data.school_year,
{ transaction }
);
}
if (data.term !== undefined) {
await exams.setTerm(
data.term,
{ transaction }
);
}
if (data.grade !== undefined) {
await exams.setGrade(
data.grade,
{ transaction }
);
}
if (data.stream !== undefined) {
await exams.setStream(
data.stream,
{ transaction }
);
}
return exams;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const exams = await db.exams.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of exams) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of exams) {
await record.destroy({transaction});
}
});
return exams;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const exams = await db.exams.findByPk(id, options);
await exams.update({
deletedBy: currentUser.id
}, {
transaction,
});
await exams.destroy({
transaction
});
return exams;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const exams = await db.exams.findOne({
      where,
      transaction,
    });
if (!exams) {
return exams;
}
const output = exams.get({plain: true});
output.exam_results_exam = await exams.getExam_results_exam({
transaction
});
output.exam_performance_summaries_exam = await exams.getExam_performance_summaries_exam({
transaction
});
output.top_student_features_exam = await exams.getTop_student_features_exam({
transaction
});
output.school_year = await exams.getSchool_year({
transaction
});
output.term = await exams.getTerm({
transaction
});
output.grade = await exams.getGrade({
transaction
});
output.stream = await exams.getStream({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
    const limit = filter.limit || 0;
    let offset = 0;
    let where = {};
    const currentPage = +filter.page || 0;
    offset = currentPage * limit;
    const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.school_years,
as: 'school_year',
where: filter.school_year ? {
[Op.or]: [
{ id: { [Op.in]: filter.school_year.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.school_year.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.terms,
as: 'term',
where: filter.term ? {
[Op.or]: [
{ id: { [Op.in]: filter.term.split('|').map(term => Utils.uuid(term)) } },
{
term_name: {
[Op.or]: filter.term.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.grades,
as: 'grade',
where: filter.grade ? {
[Op.or]: [
{ id: { [Op.in]: filter.grade.split('|').map(term => Utils.uuid(term)) } },
{
label: {
[Op.or]: filter.grade.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.streams,
as: 'stream',
where: filter.stream ? {
[Op.or]: [
{ id: { [Op.in]: filter.stream.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.stream.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'exams',
'name',
filter.name,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
[Op.or]: [
{
starts_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
{
ends_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
],
};
}
if (filter.starts_atRange) {
const [start, end] = filter.starts_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
starts_at: {
...where.starts_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
starts_at: {
...where.starts_at,
[Op.lte]: end,
},
};
}
}
if (filter.ends_atRange) {
const [start, end] = filter.ends_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
ends_at: {
...where.ends_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
ends_at: {
...where.ends_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.exam_type) {
where = {
...where,
exam_type: filter.exam_type,
};
}
if (filter.public_results) {
where = {
...where,
public_results: filter.public_results,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.exams.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'exams',
'name',
query,
),
],
};
}
const records = await db.exams.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
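Each `findAll` above converts `filter.page` and `filter.limit` into the `limit`/`offset` pair passed to `findAndCountAll`. The arithmetic in isolation, as a hypothetical helper (`pageToOffset` is not part of the source; it assumes a 0-based page, matching `offset = currentPage * limit`):

```javascript
// Hypothetical mirror of the pagination math in findAll.
// A limit of 0 (or missing) means "no limit", matching `filter.limit || 0` above.
function pageToOffset(page, limit) {
  const lim = Number(limit) || 0;
  const currentPage = Number(page) || 0; // guards against NaN when page is absent
  return {
    limit: lim ? lim : undefined,
    offset: lim ? currentPage * lim : undefined,
  };
}
```

Returning `undefined` rather than `0` matters: Sequelize omits the `LIMIT`/`OFFSET` clauses entirely when the options are `undefined`, which is how the unfiltered "fetch everything" case works.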


@ -0,0 +1,87 @@
const db = require('../models');
const assert = require('assert');
const services = require('../../services/file');
module.exports = class FileDBApi {
static async replaceRelationFiles(
relation,
rawFiles,
options,
) {
assert(relation.belongsTo, 'belongsTo is required');
assert(
relation.belongsToColumn,
'belongsToColumn is required',
);
assert(relation.belongsToId, 'belongsToId is required');
let files = [];
if (Array.isArray(rawFiles)) {
files = rawFiles;
} else {
files = rawFiles ? [rawFiles] : [];
}
await this._removeLegacyFiles(relation, files, options);
await this._addFiles(relation, files, options);
}
static async _addFiles(relation, files, options) {
const transaction = (options && options.transaction) || undefined;
const currentUser = (options && options.currentUser) || {id: null};
const inexistentFiles = files.filter(
(file) => !!file.new,
);
for (const file of inexistentFiles) {
await db.file.create(
{
belongsTo: relation.belongsTo,
belongsToColumn: relation.belongsToColumn,
belongsToId: relation.belongsToId,
name: file.name,
sizeInBytes: file.sizeInBytes,
privateUrl: file.privateUrl,
publicUrl: file.publicUrl,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{
transaction,
},
);
}
}
static async _removeLegacyFiles(
relation,
files,
options,
) {
const transaction = (options && options.transaction) || undefined;
const filesToDelete = await db.file.findAll({
where: {
belongsTo: relation.belongsTo,
belongsToId: relation.belongsToId,
belongsToColumn: relation.belongsToColumn,
        id: {
          [db.Sequelize.Op.notIn]: files
            .filter((file) => !file.new)
            .map((file) => file.id),
        },
},
transaction,
});
for (let file of filesToDelete) {
await services.deleteGCloud(file.privateUrl);
await file.destroy({
transaction,
});
}
}
};
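`replaceRelationFiles` above reconciles incoming files against stored rows: files flagged `new` are inserted by `_addFiles`, and `_removeLegacyFiles` deletes any stored row whose id is absent from the incoming non-new list. That set logic, isolated from the DB and storage calls (`planFileSync` is a hypothetical helper, not part of the source):

```javascript
// Hypothetical sketch of the reconciliation rule in replaceRelationFiles:
// files flagged `new` get created; stored files whose ids are no longer in the
// incoming (non-new) list get deleted.
function planFileSync(existingIds, rawFiles) {
  // Same normalization as replaceRelationFiles: accept one file, many, or none.
  const files = Array.isArray(rawFiles) ? rawFiles : rawFiles ? [rawFiles] : [];
  const keepIds = new Set(files.filter((f) => !f.new).map((f) => f.id));
  return {
    toCreate: files.filter((f) => !!f.new),
    toDelete: existingIds.filter((id) => !keepIds.has(id)),
  };
}
```

Passing an empty or null `rawFiles` therefore deletes every stored file for the relation, which is exactly how clearing an upload field behaves in the generated CRUD.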


@ -0,0 +1,582 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class GalleriesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
    const galleries = await db.galleries.create(
      {
        id: data.id || undefined,
        title_om: data.title_om || null,
        title_am: data.title_am || null,
        title_en: data.title_en || null,
        description_om: data.description_om || null,
        description_am: data.description_am || null,
        description_en: data.description_en || null,
        published: data.published || false,
        published_at: data.published_at || null,
        importHash: data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.galleries.getTableName(),
belongsToColumn: 'images',
belongsToId: galleries.id,
},
data.images,
options,
);
return galleries;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
    const galleriesData = data.map((item, index) => ({
      id: item.id || undefined,
      title_om: item.title_om || null,
      title_am: item.title_am || null,
      title_en: item.title_en || null,
      description_om: item.description_om || null,
      description_am: item.description_am || null,
      description_en: item.description_en || null,
      published: item.published || false,
      published_at: item.published_at || null,
      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));
// Bulk create items
const galleries = await db.galleries.bulkCreate(galleriesData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < galleries.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.galleries.getTableName(),
belongsToColumn: 'images',
belongsToId: galleries[i].id,
},
data[i].images,
options,
);
}
return galleries;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const galleries = await db.galleries.findByPk(id, { transaction });
const updatePayload = {};
if (data.title_om !== undefined) updatePayload.title_om = data.title_om;
if (data.title_am !== undefined) updatePayload.title_am = data.title_am;
if (data.title_en !== undefined) updatePayload.title_en = data.title_en;
if (data.description_om !== undefined) updatePayload.description_om = data.description_om;
if (data.description_am !== undefined) updatePayload.description_am = data.description_am;
if (data.description_en !== undefined) updatePayload.description_en = data.description_en;
if (data.published !== undefined) updatePayload.published = data.published;
if (data.published_at !== undefined) updatePayload.published_at = data.published_at;
updatePayload.updatedById = currentUser.id;
await galleries.update(updatePayload, {transaction});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.galleries.getTableName(),
belongsToColumn: 'images',
belongsToId: galleries.id,
},
data.images,
options,
);
return galleries;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const galleries = await db.galleries.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of galleries) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of galleries) {
await record.destroy({transaction});
}
});
return galleries;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const galleries = await db.galleries.findByPk(id, options);
await galleries.update({
deletedBy: currentUser.id
}, {
transaction,
});
await galleries.destroy({
transaction
});
return galleries;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const galleries = await db.galleries.findOne({
      where,
      transaction,
    });
if (!galleries) {
return galleries;
}
const output = galleries.get({plain: true});
output.images = await galleries.getImages({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
    const limit = filter.limit || 0;
    let offset = 0;
    let where = {};
    const currentPage = +filter.page || 0;
    offset = currentPage * limit;
    const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.file,
as: 'images',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.title_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'galleries',
'title_om',
filter.title_om,
),
};
}
if (filter.title_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'galleries',
'title_am',
filter.title_am,
),
};
}
if (filter.title_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'galleries',
'title_en',
filter.title_en,
),
};
}
if (filter.description_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'galleries',
'description_om',
filter.description_om,
),
};
}
if (filter.description_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'galleries',
'description_am',
filter.description_am,
),
};
}
if (filter.description_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'galleries',
'description_en',
filter.description_en,
),
};
}
if (filter.published_atRange) {
const [start, end] = filter.published_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
published_at: {
...where.published_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
published_at: {
...where.published_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.published) {
where = {
...where,
published: filter.published,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.galleries.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'galleries',
'title_en',
query,
),
],
};
}
const records = await db.galleries.findAll({
attributes: [ 'id', 'title_en' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['title_en', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.title_en,
}));
}
};


@ -0,0 +1,442 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class GradesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
    const grades = await db.grades.create(
      {
        id: data.id || undefined,
        grade_number: data.grade_number || null,
        level: data.level || null,
        label: data.label || null,
        importHash: data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );
return grades;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
    const gradesData = data.map((item, index) => ({
      id: item.id || undefined,
      grade_number: item.grade_number || null,
      level: item.level || null,
      label: item.label || null,
      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));
// Bulk create items
const grades = await db.grades.bulkCreate(gradesData, { transaction });
// For each item created, replace relation files
return grades;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const grades = await db.grades.findByPk(id, { transaction });
const updatePayload = {};
if (data.grade_number !== undefined) updatePayload.grade_number = data.grade_number;
if (data.level !== undefined) updatePayload.level = data.level;
if (data.label !== undefined) updatePayload.label = data.label;
updatePayload.updatedById = currentUser.id;
await grades.update(updatePayload, {transaction});
return grades;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const grades = await db.grades.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of grades) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of grades) {
await record.destroy({transaction});
}
});
return grades;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const grades = await db.grades.findByPk(id, options);
await grades.update({
deletedBy: currentUser.id
}, {
transaction,
});
await grades.destroy({
transaction
});
return grades;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const grades = await db.grades.findOne({
      where,
      transaction,
    });
if (!grades) {
return grades;
}
const output = grades.get({plain: true});
output.subject_offerings_grade = await grades.getSubject_offerings_grade({
transaction
});
output.students_current_grade = await grades.getStudents_current_grade({
transaction
});
output.class_sections_grade = await grades.getClass_sections_grade({
transaction
});
output.study_materials_grade = await grades.getStudy_materials_grade({
transaction
});
output.exams_grade = await grades.getExams_grade({
transaction
});
output.admission_applications_requested_grade = await grades.getAdmission_applications_requested_grade({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
    const limit = filter.limit || 0;
    let offset = 0;
    let where = {};
    const currentPage = +filter.page || 0;
    offset = currentPage * limit;
    const transaction = (options && options.transaction) || undefined;
let include = [
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.label) {
where = {
...where,
[Op.and]: Utils.ilike(
'grades',
'label',
filter.label,
),
};
}
if (filter.grade_numberRange) {
const [start, end] = filter.grade_numberRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
grade_number: {
...where.grade_number,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
grade_number: {
...where.grade_number,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.level) {
where = {
...where,
level: filter.level,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.grades.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'grades',
'label',
query,
),
],
};
}
const records = await db.grades.findAll({
attributes: [ 'id', 'label' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['label', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.label,
}));
}
};
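The association filters in every `findAll` accept pipe-separated terms (e.g. `filter.exam.split('|')`) and match each term either by UUID or as a case-insensitive partial name. A self-contained, in-memory sketch of that matching rule (`matchTerms` is hypothetical; the real code builds Sequelize `Op.in`/`Op.iLike` clauses against the included model instead):

```javascript
// Hypothetical in-memory version of the pipe-separated association filter:
// a record matches when any term equals its id or case-insensitively appears
// in its label, mirroring the Op.in / Op.iLike pair built in findAll.
function matchTerms(record, rawFilter) {
  const terms = rawFilter.split('|').filter(Boolean);
  return terms.some(
    (term) =>
      record.id === term ||
      record.label.toLowerCase().includes(term.toLowerCase()),
  );
}
```

Because the terms combine under an OR, adding more pipe-separated terms widens the result set; each include's `where` stays `{}` when its filter key is absent, so unfiltered associations do not constrain the join.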


@ -0,0 +1,626 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Lesson_plansDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
    const lesson_plans = await db.lesson_plans.create(
      {
        id: data.id || undefined,
        week_start_at: data.week_start_at || null,
        week_end_at: data.week_end_at || null,
        plan_content: data.plan_content || null,
        status: data.status || null,
        importHash: data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );
    await lesson_plans.setTeacher(data.teacher || null, { transaction });
    await lesson_plans.setClass_section(data.class_section || null, { transaction });
    await lesson_plans.setSubject(data.subject || null, { transaction });
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.lesson_plans.getTableName(),
belongsToColumn: 'attachments',
belongsToId: lesson_plans.id,
},
data.attachments,
options,
);
return lesson_plans;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
    const lesson_plansData = data.map((item, index) => ({
      id: item.id || undefined,
      week_start_at: item.week_start_at || null,
      week_end_at: item.week_end_at || null,
      plan_content: item.plan_content || null,
      status: item.status || null,
      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));
// Bulk create items
const lesson_plans = await db.lesson_plans.bulkCreate(lesson_plansData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < lesson_plans.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.lesson_plans.getTableName(),
belongsToColumn: 'attachments',
belongsToId: lesson_plans[i].id,
},
data[i].attachments,
options,
);
}
return lesson_plans;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const lesson_plans = await db.lesson_plans.findByPk(id, { transaction });
const updatePayload = {};
if (data.week_start_at !== undefined) updatePayload.week_start_at = data.week_start_at;
if (data.week_end_at !== undefined) updatePayload.week_end_at = data.week_end_at;
if (data.plan_content !== undefined) updatePayload.plan_content = data.plan_content;
if (data.status !== undefined) updatePayload.status = data.status;
updatePayload.updatedById = currentUser.id;
await lesson_plans.update(updatePayload, {transaction});
if (data.teacher !== undefined) {
await lesson_plans.setTeacher(
data.teacher,
{ transaction }
);
}
if (data.class_section !== undefined) {
await lesson_plans.setClass_section(
data.class_section,
{ transaction }
);
}
if (data.subject !== undefined) {
await lesson_plans.setSubject(
data.subject,
{ transaction }
);
}
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.lesson_plans.getTableName(),
belongsToColumn: 'attachments',
belongsToId: lesson_plans.id,
},
data.attachments,
options,
);
return lesson_plans;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const lesson_plans = await db.lesson_plans.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of lesson_plans) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of lesson_plans) {
await record.destroy({transaction});
}
});
return lesson_plans;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const lesson_plans = await db.lesson_plans.findByPk(id, options);
await lesson_plans.update({
deletedBy: currentUser.id
}, {
transaction,
});
await lesson_plans.destroy({
transaction
});
return lesson_plans;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
// findOne takes a single options object, so merge the transaction into it
const lesson_plans = await db.lesson_plans.findOne({ where, transaction });
if (!lesson_plans) {
return lesson_plans;
}
const output = lesson_plans.get({plain: true});
output.teacher = await lesson_plans.getTeacher({
transaction
});
output.class_section = await lesson_plans.getClass_section({
transaction
});
output.subject = await lesson_plans.getSubject({
transaction
});
output.attachments = await lesson_plans.getAttachments({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
{
model: db.staff_members,
as: 'teacher',
where: filter.teacher ? {
[Op.or]: [
{ id: { [Op.in]: filter.teacher.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.teacher.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.class_sections,
as: 'class_section',
where: filter.class_section ? {
[Op.or]: [
{ id: { [Op.in]: filter.class_section.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.class_section.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.subjects,
as: 'subject',
where: filter.subject ? {
[Op.or]: [
{ id: { [Op.in]: filter.subject.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.subject.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.file,
as: 'attachments',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.plan_content) {
where = {
...where,
[Op.and]: Utils.ilike(
'lesson_plans',
'plan_content',
filter.plan_content,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
[Op.or]: [
{
week_start_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
{
week_end_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
],
};
}
if (filter.week_start_atRange) {
const [start, end] = filter.week_start_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
week_start_at: {
...where.week_start_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
week_start_at: {
...where.week_start_at,
[Op.lte]: end,
},
};
}
}
if (filter.week_end_atRange) {
const [start, end] = filter.week_end_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
week_end_at: {
...where.week_end_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
week_end_at: {
...where.week_end_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.status) {
where = {
...where,
status: filter.status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.lesson_plans.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset, ) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'lesson_plans',
'status',
query,
),
],
};
}
const records = await db.lesson_plans.findAll({
attributes: [ 'id', 'status' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
// Sequelize expects `order`, not `orderBy`
order: [['status', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.status,
}));
}
};
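The pipe-separated relation filters in `findAll` above (e.g. `filter.teacher.split('|')`) all follow one pattern: match either an exact id or a case-insensitive partial name. A minimal standalone sketch of that pattern, purely illustrative and not part of the generated API — plain string keys stand in for the Sequelize `Op` symbols so it runs without the library:

```javascript
// Sketch only: turns a raw "Alice|Bob" filter string into an OR clause
// matching either an exact id or a case-insensitive partial name match.
// String keys ('or', 'in', 'iLike') are stand-ins for Sequelize Op symbols.
function buildRelationFilter(raw, nameField) {
  if (!raw) return {};
  const terms = raw.split('|');
  return {
    or: [
      { id: { in: terms } },
      { [nameField]: { or: terms.map((t) => ({ iLike: `%${t}%` })) } },
    ],
  };
}
```

In the generated code the same shape is applied per association (`teacher`, `class_section`, `subject`) inside each `include` entry.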

View File

@ -0,0 +1,553 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class MessagesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const messages = await db.messages.create(
{
id: data.id || undefined,
subject: data.subject || null,
body: data.body || null,
status: data.status || null,
sent_at: data.sent_at || null,
read_at: data.read_at || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await messages.setSender( data.sender || null, {
transaction,
});
await messages.setRecipient( data.recipient || null, {
transaction,
});
return messages;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Map each import row to a model payload; stagger createdAt so imports keep their order
const messagesData = data.map((item, index) => ({
id: item.id || undefined,
subject: item.subject || null,
body: item.body || null,
status: item.status || null,
sent_at: item.sent_at || null,
read_at: item.read_at || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const messages = await db.messages.bulkCreate(messagesData, { transaction });
return messages;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const messages = await db.messages.findByPk(id, { transaction });
const updatePayload = {};
if (data.subject !== undefined) updatePayload.subject = data.subject;
if (data.body !== undefined) updatePayload.body = data.body;
if (data.status !== undefined) updatePayload.status = data.status;
if (data.sent_at !== undefined) updatePayload.sent_at = data.sent_at;
if (data.read_at !== undefined) updatePayload.read_at = data.read_at;
updatePayload.updatedById = currentUser.id;
await messages.update(updatePayload, {transaction});
if (data.sender !== undefined) {
await messages.setSender(
data.sender,
{ transaction }
);
}
if (data.recipient !== undefined) {
await messages.setRecipient(
data.recipient,
{ transaction }
);
}
return messages;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const messages = await db.messages.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of messages) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of messages) {
await record.destroy({transaction});
}
});
return messages;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const messages = await db.messages.findByPk(id, options);
await messages.update({
deletedBy: currentUser.id
}, {
transaction,
});
await messages.destroy({
transaction
});
return messages;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const messages = await db.messages.findOne({ where, transaction });
if (!messages) {
return messages;
}
const output = messages.get({plain: true});
output.sender = await messages.getSender({
transaction
});
output.recipient = await messages.getRecipient({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
{
model: db.users,
as: 'sender',
where: filter.sender ? {
[Op.or]: [
{ id: { [Op.in]: filter.sender.split('|').map(term => Utils.uuid(term)) } },
{
firstName: {
[Op.or]: filter.sender.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.users,
as: 'recipient',
where: filter.recipient ? {
[Op.or]: [
{ id: { [Op.in]: filter.recipient.split('|').map(term => Utils.uuid(term)) } },
{
firstName: {
[Op.or]: filter.recipient.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.subject) {
where = {
...where,
[Op.and]: Utils.ilike(
'messages',
'subject',
filter.subject,
),
};
}
if (filter.body) {
where = {
...where,
[Op.and]: Utils.ilike(
'messages',
'body',
filter.body,
),
};
}
if (filter.sent_atRange) {
const [start, end] = filter.sent_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
sent_at: {
...where.sent_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
sent_at: {
...where.sent_at,
[Op.lte]: end,
},
};
}
}
if (filter.read_atRange) {
const [start, end] = filter.read_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
read_at: {
...where.read_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
read_at: {
...where.read_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.status) {
where = {
...where,
status: filter.status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.messages.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset, ) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'messages',
'subject',
query,
),
],
};
}
const records = await db.messages.findAll({
attributes: [ 'id', 'subject' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['subject', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.subject,
}));
}
};
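Each `findAll` computes its paging window the same way: `limit = filter.limit || 0` and `offset = page * limit`, with falsy values collapsing to `undefined` so Sequelize applies no bound. A hedged standalone sketch of that zero-based convention (illustration only, not part of the generated API):

```javascript
// Sketch of the zero-based paging used in findAll: page N of size `limit`
// starts at row N * limit; a zero/empty size or offset becomes undefined,
// which means "no limit/offset" when passed to the query.
function pageWindow(page, limit) {
  const size = limit || 0;
  const offset = (+page || 0) * size;
  return {
    limit: size ? Number(size) : undefined,
    offset: offset ? Number(offset) : undefined,
  };
}
```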

View File

@ -0,0 +1,689 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class News_postsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const news_posts = await db.news_posts.create(
{
id: data.id || undefined,
title_om: data.title_om || null,
title_am: data.title_am || null,
title_en: data.title_en || null,
excerpt_om: data.excerpt_om || null,
excerpt_am: data.excerpt_am || null,
excerpt_en: data.excerpt_en || null,
content_om: data.content_om || null,
content_am: data.content_am || null,
content_en: data.content_en || null,
status: data.status || null,
published_at: data.published_at || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await news_posts.setAuthor( data.author || null, {
transaction,
});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.news_posts.getTableName(),
belongsToColumn: 'featured_image',
belongsToId: news_posts.id,
},
data.featured_image,
options,
);
return news_posts;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Map each import row to a model payload; stagger createdAt so imports keep their order
const news_postsData = data.map((item, index) => ({
id: item.id || undefined,
title_om: item.title_om || null,
title_am: item.title_am || null,
title_en: item.title_en || null,
excerpt_om: item.excerpt_om || null,
excerpt_am: item.excerpt_am || null,
excerpt_en: item.excerpt_en || null,
content_om: item.content_om || null,
content_am: item.content_am || null,
content_en: item.content_en || null,
status: item.status || null,
published_at: item.published_at || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const news_posts = await db.news_posts.bulkCreate(news_postsData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < news_posts.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.news_posts.getTableName(),
belongsToColumn: 'featured_image',
belongsToId: news_posts[i].id,
},
data[i].featured_image,
options,
);
}
return news_posts;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const news_posts = await db.news_posts.findByPk(id, { transaction });
const updatePayload = {};
if (data.title_om !== undefined) updatePayload.title_om = data.title_om;
if (data.title_am !== undefined) updatePayload.title_am = data.title_am;
if (data.title_en !== undefined) updatePayload.title_en = data.title_en;
if (data.excerpt_om !== undefined) updatePayload.excerpt_om = data.excerpt_om;
if (data.excerpt_am !== undefined) updatePayload.excerpt_am = data.excerpt_am;
if (data.excerpt_en !== undefined) updatePayload.excerpt_en = data.excerpt_en;
if (data.content_om !== undefined) updatePayload.content_om = data.content_om;
if (data.content_am !== undefined) updatePayload.content_am = data.content_am;
if (data.content_en !== undefined) updatePayload.content_en = data.content_en;
if (data.status !== undefined) updatePayload.status = data.status;
if (data.published_at !== undefined) updatePayload.published_at = data.published_at;
updatePayload.updatedById = currentUser.id;
await news_posts.update(updatePayload, {transaction});
if (data.author !== undefined) {
await news_posts.setAuthor(
data.author,
{ transaction }
);
}
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.news_posts.getTableName(),
belongsToColumn: 'featured_image',
belongsToId: news_posts.id,
},
data.featured_image,
options,
);
return news_posts;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const news_posts = await db.news_posts.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of news_posts) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of news_posts) {
await record.destroy({transaction});
}
});
return news_posts;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const news_posts = await db.news_posts.findByPk(id, options);
await news_posts.update({
deletedBy: currentUser.id
}, {
transaction,
});
await news_posts.destroy({
transaction
});
return news_posts;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const news_posts = await db.news_posts.findOne({ where, transaction });
if (!news_posts) {
return news_posts;
}
const output = news_posts.get({plain: true});
output.featured_image = await news_posts.getFeatured_image({
transaction
});
output.author = await news_posts.getAuthor({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
{
model: db.staff_members,
as: 'author',
where: filter.author ? {
[Op.or]: [
{ id: { [Op.in]: filter.author.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.author.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.file,
as: 'featured_image',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.title_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'news_posts',
'title_om',
filter.title_om,
),
};
}
if (filter.title_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'news_posts',
'title_am',
filter.title_am,
),
};
}
if (filter.title_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'news_posts',
'title_en',
filter.title_en,
),
};
}
if (filter.excerpt_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'news_posts',
'excerpt_om',
filter.excerpt_om,
),
};
}
if (filter.excerpt_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'news_posts',
'excerpt_am',
filter.excerpt_am,
),
};
}
if (filter.excerpt_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'news_posts',
'excerpt_en',
filter.excerpt_en,
),
};
}
if (filter.content_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'news_posts',
'content_om',
filter.content_om,
),
};
}
if (filter.content_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'news_posts',
'content_am',
filter.content_am,
),
};
}
if (filter.content_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'news_posts',
'content_en',
filter.content_en,
),
};
}
if (filter.published_atRange) {
const [start, end] = filter.published_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
published_at: {
...where.published_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
published_at: {
...where.published_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.status) {
where = {
...where,
status: filter.status,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.news_posts.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset, ) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'news_posts',
'title_en',
query,
),
],
};
}
const records = await db.news_posts.findAll({
attributes: [ 'id', 'title_en' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['title_en', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.title_en,
}));
}
};
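Every `findAll` above serves both listing and counting through the `options.countOnly` flag: when counting, `limit`/`offset` are omitted and the fetched rows are discarded. A tiny standalone sketch of that result-shaping convention (assumption: illustration only, not the generated code itself):

```javascript
// Sketch: one code path returns either the page of rows plus the total,
// or just the total with an empty rows array when only a count is wanted.
function shapeResult(rows, count, countOnly) {
  return { rows: countOnly ? [] : rows, count };
}
```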

602
backend/src/db/api/pages.js Normal file
View File

@ -0,0 +1,602 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class PagesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const pages = await db.pages.create(
{
id: data.id || undefined,
page_type: data.page_type || null,
title_om: data.title_om || null,
title_am: data.title_am || null,
title_en: data.title_en || null,
content_om: data.content_om || null,
content_am: data.content_am || null,
content_en: data.content_en || null,
published: data.published || false,
published_at: data.published_at || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.pages.getTableName(),
belongsToColumn: 'cover_image',
belongsToId: pages.id,
},
data.cover_image,
options,
);
return pages;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Map each import row to a model payload; stagger createdAt so imports keep their order
const pagesData = data.map((item, index) => ({
id: item.id || undefined,
page_type: item.page_type || null,
title_om: item.title_om || null,
title_am: item.title_am || null,
title_en: item.title_en || null,
content_om: item.content_om || null,
content_am: item.content_am || null,
content_en: item.content_en || null,
published: item.published || false,
published_at: item.published_at || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const pages = await db.pages.bulkCreate(pagesData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < pages.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.pages.getTableName(),
belongsToColumn: 'cover_image',
belongsToId: pages[i].id,
},
data[i].cover_image,
options,
);
}
return pages;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const pages = await db.pages.findByPk(id, { transaction });
const updatePayload = {};
if (data.page_type !== undefined) updatePayload.page_type = data.page_type;
if (data.title_om !== undefined) updatePayload.title_om = data.title_om;
if (data.title_am !== undefined) updatePayload.title_am = data.title_am;
if (data.title_en !== undefined) updatePayload.title_en = data.title_en;
if (data.content_om !== undefined) updatePayload.content_om = data.content_om;
if (data.content_am !== undefined) updatePayload.content_am = data.content_am;
if (data.content_en !== undefined) updatePayload.content_en = data.content_en;
if (data.published !== undefined) updatePayload.published = data.published;
if (data.published_at !== undefined) updatePayload.published_at = data.published_at;
updatePayload.updatedById = currentUser.id;
await pages.update(updatePayload, {transaction});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.pages.getTableName(),
belongsToColumn: 'cover_image',
belongsToId: pages.id,
},
data.cover_image,
options,
);
return pages;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const pages = await db.pages.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of pages) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of pages) {
await record.destroy({transaction});
}
});
return pages;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const pages = await db.pages.findByPk(id, options);
await pages.update({
deletedBy: currentUser.id
}, {
transaction,
});
await pages.destroy({
transaction
});
return pages;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const pages = await db.pages.findOne({ where, transaction });
if (!pages) {
return pages;
}
const output = pages.get({plain: true});
output.cover_image = await pages.getCover_image({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
{
model: db.file,
as: 'cover_image',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.title_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'pages',
'title_om',
filter.title_om,
),
};
}
if (filter.title_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'pages',
'title_am',
filter.title_am,
),
};
}
if (filter.title_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'pages',
'title_en',
filter.title_en,
),
};
}
if (filter.content_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'pages',
'content_om',
filter.content_om,
),
};
}
if (filter.content_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'pages',
'content_am',
filter.content_am,
),
};
}
if (filter.content_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'pages',
'content_en',
filter.content_en,
),
};
}
if (filter.published_atRange) {
const [start, end] = filter.published_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
published_at: {
...where.published_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
published_at: {
...where.published_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.page_type) {
where = {
...where,
page_type: filter.page_type,
};
}
if (filter.published) {
where = {
...where,
published: filter.published,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.pages.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset, ) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'pages',
'title_en',
query,
),
],
};
}
const records = await db.pages.findAll({
attributes: [ 'id', 'title_en' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['title_en', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.title_en,
}));
}
};
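The `createdAtRange`, `published_atRange`, and similar range filters above all reduce to the same rule: a `[start, end]` pair becomes inclusive lower/upper bounds, skipping ends that are empty. A minimal standalone sketch of that rule (illustration only; plain `gte`/`lte` keys stand in for the Sequelize `Op` symbols):

```javascript
// Sketch: turn an inclusive [start, end] range into gte/lte bounds,
// omitting any end that is undefined, null, or an empty string.
function dateRangeClause(range) {
  const [start, end] = range || [];
  const clause = {};
  if (start !== undefined && start !== null && start !== '') clause.gte = start;
  if (end !== undefined && end !== null && end !== '') clause.lte = end;
  return clause;
}
```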

View File

@ -0,0 +1,361 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class PermissionsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const permissions = await db.permissions.create(
{
id: data.id || undefined,
name: data.name || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
return permissions;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Map each import row to a model payload; stagger createdAt so imports keep their order
const permissionsData = data.map((item, index) => ({
id: item.id || undefined,
name: item.name || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const permissions = await db.permissions.bulkCreate(permissionsData, { transaction });
return permissions;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const permissions = await db.permissions.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
updatePayload.updatedById = currentUser.id;
await permissions.update(updatePayload, {transaction});
return permissions;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const permissions = await db.permissions.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of permissions) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of permissions) {
await record.destroy({transaction});
}
});
return permissions;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const permissions = await db.permissions.findByPk(id, options);
await permissions.update({
deletedBy: currentUser.id
}, {
transaction,
});
await permissions.destroy({
transaction
});
return permissions;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    // findOne takes a single options object; a second argument is ignored, so the transaction belongs inside it
    const permissions = await db.permissions.findOne({ where, transaction });
if (!permissions) {
return permissions;
}
const output = permissions.get({plain: true});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'permissions',
'name',
filter.name,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.permissions.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'permissions',
'name',
query,
),
],
};
}
const records = await db.permissions.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
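The `findAll` method above derives its query window from `filter.page` and `filter.limit`, treating a limit of 0 as "no limit". A minimal standalone sketch of that pagination math (the helper name `toQueryWindow` is hypothetical, and the `|| 0` guard on `page` is an added hardening assumption, since `+undefined` would otherwise yield `NaN`):

```javascript
// Hypothetical helper mirroring the pagination math in findAll:
// offset = page * limit, and a limit of 0 means "return every row",
// so both limit and offset are left undefined for Sequelize.
function toQueryWindow(filter) {
  const limit = filter.limit || 0;
  const offset = (+filter.page || 0) * limit;
  return {
    limit: limit ? Number(limit) : undefined,
    offset: offset ? Number(offset) : undefined,
  };
}

// page 2 with 25 rows per page skips the first 50 rows
console.log(toQueryWindow({ page: 2, limit: 25 })); // { limit: 25, offset: 50 }
console.log(toQueryWindow({ page: 0, limit: 0 }));  // { limit: undefined, offset: undefined }
```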

backend/src/db/api/roles.js (new file, 431 lines)
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class RolesDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const roles = await db.roles.create(
{
id: data.id || undefined,
        name: data.name || null,
        role_customization: data.role_customization || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await roles.setPermissions(data.permissions || [], {
transaction,
});
return roles;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const rolesData = data.map((item, index) => ({
id: item.id || undefined,
      name: item.name || null,
      role_customization: item.role_customization || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const roles = await db.roles.bulkCreate(rolesData, { transaction });
// For each item created, replace relation files
return roles;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const roles = await db.roles.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
if (data.role_customization !== undefined) updatePayload.role_customization = data.role_customization;
updatePayload.updatedById = currentUser.id;
await roles.update(updatePayload, {transaction});
if (data.permissions !== undefined) {
await roles.setPermissions(data.permissions, { transaction });
}
return roles;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const roles = await db.roles.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of roles) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of roles) {
await record.destroy({transaction});
}
});
return roles;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const roles = await db.roles.findByPk(id, options);
await roles.update({
deletedBy: currentUser.id
}, {
transaction,
});
await roles.destroy({
transaction
});
return roles;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const roles = await db.roles.findOne({ where, transaction });
if (!roles) {
return roles;
}
const output = roles.get({plain: true});
output.users_app_role = await roles.getUsers_app_role({
transaction
});
output.permissions = await roles.getPermissions({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.permissions,
as: 'permissions',
required: false,
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'roles',
'name',
filter.name,
),
};
}
if (filter.role_customization) {
where = {
...where,
[Op.and]: Utils.ilike(
'roles',
'role_customization',
filter.role_customization,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.permissions) {
const searchTerms = filter.permissions.split('|');
include = [
{
model: db.permissions,
as: 'permissions_filter',
required: searchTerms.length > 0,
where: searchTerms.length > 0 ? {
[Op.or]: [
{ id: { [Op.in]: searchTerms.map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: searchTerms.map(term => ({ [Op.iLike]: `%${term}%` }))
}
}
]
} : undefined
},
...include,
]
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.roles.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'roles',
'name',
query,
),
],
};
}
const records = await db.roles.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
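The permissions filter in `findAll` above accepts several search terms in one `'|'`-delimited string, matching each term as an id or as a case-insensitive name fragment. A hedged, database-free sketch of that parsing step (the helper name `parseSearchTerms` is hypothetical; in the real query the patterns feed Sequelize's `Op.iLike`):

```javascript
// Hypothetical helper showing how a '|'-delimited filter string is split
// into per-term candidates: each term is tried both as an id and as a
// case-insensitive name pattern.
function parseSearchTerms(raw) {
  return raw.split('|').map((term) => ({
    idCandidate: term,
    namePattern: `%${term}%`, // fed to ILIKE in the real query
  }));
}

console.log(parseSearchTerms('admin|editor'));
// two entries: { idCandidate: 'admin', namePattern: '%admin%' } and
// { idCandidate: 'editor', namePattern: '%editor%' }
```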

(new file, 685 lines)
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class School_settingsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const school_settings = await db.school_settings.create(
{
id: data.id || undefined,
        school_name_om: data.school_name_om || null,
        school_name_am: data.school_name_am || null,
        school_name_en: data.school_name_en || null,
        welcome_message_om: data.welcome_message_om || null,
        welcome_message_am: data.welcome_message_am || null,
        welcome_message_en: data.welcome_message_en || null,
        address_text: data.address_text || null,
        map_embed_url: data.map_embed_url || null,
        public_phone_numbers: data.public_phone_numbers || null,
        public_emails: data.public_emails || null,
        social_links: data.social_links || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.school_settings.getTableName(),
belongsToColumn: 'logo',
belongsToId: school_settings.id,
},
data.logo,
options,
);
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.school_settings.getTableName(),
belongsToColumn: 'hero_images',
belongsToId: school_settings.id,
},
data.hero_images,
options,
);
return school_settings;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const school_settingsData = data.map((item, index) => ({
id: item.id || undefined,
      school_name_om: item.school_name_om || null,
      school_name_am: item.school_name_am || null,
      school_name_en: item.school_name_en || null,
      welcome_message_om: item.welcome_message_om || null,
      welcome_message_am: item.welcome_message_am || null,
      welcome_message_en: item.welcome_message_en || null,
      address_text: item.address_text || null,
      map_embed_url: item.map_embed_url || null,
      public_phone_numbers: item.public_phone_numbers || null,
      public_emails: item.public_emails || null,
      social_links: item.social_links || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const school_settings = await db.school_settings.bulkCreate(school_settingsData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < school_settings.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.school_settings.getTableName(),
belongsToColumn: 'logo',
belongsToId: school_settings[i].id,
},
data[i].logo,
options,
);
}
for (let i = 0; i < school_settings.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.school_settings.getTableName(),
belongsToColumn: 'hero_images',
belongsToId: school_settings[i].id,
},
data[i].hero_images,
options,
);
}
return school_settings;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const school_settings = await db.school_settings.findByPk(id, { transaction });
const updatePayload = {};
if (data.school_name_om !== undefined) updatePayload.school_name_om = data.school_name_om;
if (data.school_name_am !== undefined) updatePayload.school_name_am = data.school_name_am;
if (data.school_name_en !== undefined) updatePayload.school_name_en = data.school_name_en;
if (data.welcome_message_om !== undefined) updatePayload.welcome_message_om = data.welcome_message_om;
if (data.welcome_message_am !== undefined) updatePayload.welcome_message_am = data.welcome_message_am;
if (data.welcome_message_en !== undefined) updatePayload.welcome_message_en = data.welcome_message_en;
if (data.address_text !== undefined) updatePayload.address_text = data.address_text;
if (data.map_embed_url !== undefined) updatePayload.map_embed_url = data.map_embed_url;
if (data.public_phone_numbers !== undefined) updatePayload.public_phone_numbers = data.public_phone_numbers;
if (data.public_emails !== undefined) updatePayload.public_emails = data.public_emails;
if (data.social_links !== undefined) updatePayload.social_links = data.social_links;
updatePayload.updatedById = currentUser.id;
await school_settings.update(updatePayload, {transaction});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.school_settings.getTableName(),
belongsToColumn: 'logo',
belongsToId: school_settings.id,
},
data.logo,
options,
);
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.school_settings.getTableName(),
belongsToColumn: 'hero_images',
belongsToId: school_settings.id,
},
data.hero_images,
options,
);
return school_settings;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const school_settings = await db.school_settings.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of school_settings) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of school_settings) {
await record.destroy({transaction});
}
});
return school_settings;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const school_settings = await db.school_settings.findByPk(id, options);
await school_settings.update({
deletedBy: currentUser.id
}, {
transaction,
});
await school_settings.destroy({
transaction
});
return school_settings;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const school_settings = await db.school_settings.findOne({ where, transaction });
if (!school_settings) {
return school_settings;
}
const output = school_settings.get({plain: true});
output.logo = await school_settings.getLogo({
transaction
});
output.hero_images = await school_settings.getHero_images({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.file,
as: 'logo',
},
{
model: db.file,
as: 'hero_images',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.school_name_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_settings',
'school_name_om',
filter.school_name_om,
),
};
}
if (filter.school_name_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_settings',
'school_name_am',
filter.school_name_am,
),
};
}
if (filter.school_name_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_settings',
'school_name_en',
filter.school_name_en,
),
};
}
if (filter.welcome_message_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_settings',
'welcome_message_om',
filter.welcome_message_om,
),
};
}
if (filter.welcome_message_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_settings',
'welcome_message_am',
filter.welcome_message_am,
),
};
}
if (filter.welcome_message_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_settings',
'welcome_message_en',
filter.welcome_message_en,
),
};
}
if (filter.address_text) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_settings',
'address_text',
filter.address_text,
),
};
}
if (filter.map_embed_url) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_settings',
'map_embed_url',
filter.map_embed_url,
),
};
}
if (filter.public_phone_numbers) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_settings',
'public_phone_numbers',
filter.public_phone_numbers,
),
};
}
if (filter.public_emails) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_settings',
'public_emails',
filter.public_emails,
),
};
}
if (filter.social_links) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_settings',
'social_links',
filter.social_links,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.school_settings.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'school_settings',
'school_name_en',
query,
),
],
};
}
const records = await db.school_settings.findAll({
attributes: [ 'id', 'school_name_en' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['school_name_en', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.school_name_en,
}));
}
};

(new file, 509 lines)
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class School_statisticsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const school_statistics = await db.school_statistics.create(
{
id: data.id || undefined,
        // use ?? for numeric fields so a legitimate value of 0 is not coerced to null
        student_count: data.student_count ?? null,
        teacher_count: data.teacher_count ?? null,
        pass_rate: data.pass_rate ?? null,
        notes: data.notes || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await school_statistics.setSchool_year( data.school_year || null, {
transaction,
});
return school_statistics;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const school_statisticsData = data.map((item, index) => ({
id: item.id || undefined,
      // use ?? for numeric fields so a legitimate value of 0 is not coerced to null
      student_count: item.student_count ?? null,
      teacher_count: item.teacher_count ?? null,
      pass_rate: item.pass_rate ?? null,
      notes: item.notes || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const school_statistics = await db.school_statistics.bulkCreate(school_statisticsData, { transaction });
// For each item created, replace relation files
return school_statistics;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const school_statistics = await db.school_statistics.findByPk(id, { transaction });
const updatePayload = {};
if (data.student_count !== undefined) updatePayload.student_count = data.student_count;
if (data.teacher_count !== undefined) updatePayload.teacher_count = data.teacher_count;
if (data.pass_rate !== undefined) updatePayload.pass_rate = data.pass_rate;
if (data.notes !== undefined) updatePayload.notes = data.notes;
updatePayload.updatedById = currentUser.id;
await school_statistics.update(updatePayload, {transaction});
if (data.school_year !== undefined) {
await school_statistics.setSchool_year(
data.school_year,
{ transaction }
);
}
return school_statistics;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const school_statistics = await db.school_statistics.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of school_statistics) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of school_statistics) {
await record.destroy({transaction});
}
});
return school_statistics;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const school_statistics = await db.school_statistics.findByPk(id, options);
await school_statistics.update({
deletedBy: currentUser.id
}, {
transaction,
});
await school_statistics.destroy({
transaction
});
return school_statistics;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const school_statistics = await db.school_statistics.findOne({ where, transaction });
if (!school_statistics) {
return school_statistics;
}
const output = school_statistics.get({plain: true});
output.school_year = await school_statistics.getSchool_year({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.school_years,
as: 'school_year',
where: filter.school_year ? {
[Op.or]: [
{ id: { [Op.in]: filter.school_year.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.school_year.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.notes) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_statistics',
'notes',
filter.notes,
),
};
}
if (filter.student_countRange) {
const [start, end] = filter.student_countRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
student_count: {
...where.student_count,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
student_count: {
...where.student_count,
[Op.lte]: end,
},
};
}
}
if (filter.teacher_countRange) {
const [start, end] = filter.teacher_countRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
teacher_count: {
...where.teacher_count,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
teacher_count: {
...where.teacher_count,
[Op.lte]: end,
},
};
}
}
if (filter.pass_rateRange) {
const [start, end] = filter.pass_rateRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
pass_rate: {
...where.pass_rate,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
pass_rate: {
...where.pass_rate,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.school_statistics.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
  static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'school_statistics',
'notes',
query,
),
],
};
}
const records = await db.school_statistics.findAll({
attributes: [ 'id', 'notes' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
      order: [['notes', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.notes,
}));
}
};
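The field-normalization pattern used throughout these modules, `value || null`, treats every falsy value as missing. For numeric columns like `student_count` or `pass_rate` that means a legitimate 0 is stored as null, which is why the nullish-coalescing operator `??` is the safer default there. A minimal sketch of the difference:

```javascript
// `||` falls back on any falsy value, so 0 becomes null;
// `??` falls back only on null/undefined, so 0 is preserved.
const item = { student_count: 0 };

const withOr = { student_count: item.student_count || null };      // 0 -> null
const withNullish = { student_count: item.student_count ?? null }; // 0 kept

console.log(withOr.student_count);      // null
console.log(withNullish.student_count); // 0
```

String fields are less affected, since an empty string and null are usually interchangeable in this codebase; the distinction matters mainly for counts, rates, and booleans.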

(new file, 491 lines)
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class School_yearsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const school_years = await db.school_years.create(
{
id: data.id || undefined,
        name: data.name || null,
        starts_at: data.starts_at || null,
        ends_at: data.ends_at || null,
        current: data.current || false,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
return school_years;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data - wrapping individual data transformations in a map() method
const school_yearsData = data.map((item, index) => ({
id: item.id || undefined,
      name: item.name || null,
      starts_at: item.starts_at || null,
      ends_at: item.ends_at || null,
      current: item.current || false,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const school_years = await db.school_years.bulkCreate(school_yearsData, { transaction });
// For each item created, replace relation files
return school_years;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
    const school_years = await db.school_years.findByPk(id, { transaction });
const updatePayload = {};
if (data.name !== undefined) updatePayload.name = data.name;
if (data.starts_at !== undefined) updatePayload.starts_at = data.starts_at;
if (data.ends_at !== undefined) updatePayload.ends_at = data.ends_at;
if (data.current !== undefined) updatePayload.current = data.current;
updatePayload.updatedById = currentUser.id;
await school_years.update(updatePayload, {transaction});
return school_years;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const school_years = await db.school_years.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of school_years) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of school_years) {
await record.destroy({transaction});
}
});
return school_years;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const school_years = await db.school_years.findByPk(id, options);
await school_years.update({
deletedBy: currentUser.id
}, {
transaction,
});
await school_years.destroy({
transaction
});
return school_years;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
    const school_years = await db.school_years.findOne({ where, transaction });
if (!school_years) {
return school_years;
}
const output = school_years.get({plain: true});
output.class_sections_school_year = await school_years.getClass_sections_school_year({
transaction
});
output.terms_school_year = await school_years.getTerms_school_year({
transaction
});
output.exams_school_year = await school_years.getExams_school_year({
transaction
});
output.school_statistics_school_year = await school_years.getSchool_statistics_school_year({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name) {
where = {
...where,
[Op.and]: Utils.ilike(
'school_years',
'name',
filter.name,
),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
[Op.or]: [
{
starts_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
{
ends_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
],
};
}
if (filter.starts_atRange) {
const [start, end] = filter.starts_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
starts_at: {
...where.starts_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
starts_at: {
...where.starts_at,
[Op.lte]: end,
},
};
}
}
if (filter.ends_atRange) {
const [start, end] = filter.ends_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
ends_at: {
...where.ends_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
ends_at: {
...where.ends_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.current) {
where = {
...where,
current: filter.current,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.school_years.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'school_years',
'name',
query,
),
],
};
}
const records = await db.school_years.findAll({
attributes: [ 'id', 'name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name,
}));
}
};
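The `createdAtRange` / `starts_atRange` handling above follows one pattern: merge an optional lower and upper bound into the `where` object, skipping empty values. A minimal runnable sketch of that merge, with plain `$gte`/`$lte` keys standing in for Sequelize's `Op.gte`/`Op.lte` symbols (an assumption purely so the example runs without Sequelize; `applyRange` is a hypothetical helper name):

```javascript
// Sketch of the range-filter merging used in findAll above.
// '$gte'/'$lte' string keys stand in for Sequelize's Op symbols.
function applyRange(where, field, range) {
  const [start, end] = range || [];
  let next = { ...where };
  if (start !== undefined && start !== null && start !== '') {
    next = { ...next, [field]: { ...next[field], $gte: start } };
  }
  if (end !== undefined && end !== null && end !== '') {
    next = { ...next, [field]: { ...next[field], $lte: end } };
  }
  return next;
}

// An open-ended bound ('' or null) leaves that side of the range off.
const where = applyRange({}, 'createdAt', ['2024-01-01', '2024-12-31']);
```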


@@ -0,0 +1,632 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Staff_membersDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const staff_members = await db.staff_members.create(
{
id: data.id || undefined,
full_name: data.full_name || null,
staff_role: data.staff_role || null,
bio: data.bio || null,
phone_number: data.phone_number || null,
email: data.email || null,
public_profile: data.public_profile || false,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await staff_members.setUser( data.user || null, {
transaction,
});
await staff_members.setSubjects_taught(data.subjects_taught || [], {
transaction,
});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.staff_members.getTableName(),
belongsToColumn: 'photo',
belongsToId: staff_members.id,
},
data.photo,
options,
);
return staff_members;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare the row payloads, normalizing each item via map()
const staff_membersData = data.map((item, index) => ({
id: item.id || undefined,
full_name: item.full_name || null,
staff_role: item.staff_role || null,
bio: item.bio || null,
phone_number: item.phone_number || null,
email: item.email || null,
public_profile: item.public_profile || false,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const staff_members = await db.staff_members.bulkCreate(staff_membersData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < staff_members.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.staff_members.getTableName(),
belongsToColumn: 'photo',
belongsToId: staff_members[i].id,
},
data[i].photo,
options,
);
}
return staff_members;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const staff_members = await db.staff_members.findByPk(id, { transaction });
const updatePayload = {};
if (data.full_name !== undefined) updatePayload.full_name = data.full_name;
if (data.staff_role !== undefined) updatePayload.staff_role = data.staff_role;
if (data.bio !== undefined) updatePayload.bio = data.bio;
if (data.phone_number !== undefined) updatePayload.phone_number = data.phone_number;
if (data.email !== undefined) updatePayload.email = data.email;
if (data.public_profile !== undefined) updatePayload.public_profile = data.public_profile;
updatePayload.updatedById = currentUser.id;
await staff_members.update(updatePayload, {transaction});
if (data.user !== undefined) {
await staff_members.setUser(
data.user,
{ transaction }
);
}
if (data.subjects_taught !== undefined) {
await staff_members.setSubjects_taught(data.subjects_taught, { transaction });
}
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.staff_members.getTableName(),
belongsToColumn: 'photo',
belongsToId: staff_members.id,
},
data.photo,
options,
);
return staff_members;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const staff_members = await db.staff_members.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of staff_members) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of staff_members) {
await record.destroy({transaction});
}
});
return staff_members;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const staff_members = await db.staff_members.findByPk(id, options);
await staff_members.update({
deletedBy: currentUser.id
}, {
transaction,
});
await staff_members.destroy({
transaction
});
return staff_members;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const staff_members = await db.staff_members.findOne({ where, transaction });
if (!staff_members) {
return staff_members;
}
const output = staff_members.get({plain: true});
output.subject_offerings_responsible_teacher = await staff_members.getSubject_offerings_responsible_teacher({
transaction
});
output.class_sections_homeroom_teacher = await staff_members.getClass_sections_homeroom_teacher({
transaction
});
output.news_posts_author = await staff_members.getNews_posts_author({
transaction
});
output.study_materials_uploaded_by = await staff_members.getStudy_materials_uploaded_by({
transaction
});
output.exam_results_entered_by = await staff_members.getExam_results_entered_by({
transaction
});
output.attendance_sessions_teacher = await staff_members.getAttendance_sessions_teacher({
transaction
});
output.class_schedules_teacher = await staff_members.getClass_schedules_teacher({
transaction
});
output.assignments_teacher = await staff_members.getAssignments_teacher({
transaction
});
output.lesson_plans_teacher = await staff_members.getLesson_plans_teacher({
transaction
});
output.user = await staff_members.getUser({
transaction
});
output.photo = await staff_members.getPhoto({
transaction
});
output.subjects_taught = await staff_members.getSubjects_taught({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
{
model: db.users,
as: 'user',
where: filter.user ? {
[Op.or]: [
{ id: { [Op.in]: filter.user.split('|').map(term => Utils.uuid(term)) } },
{
firstName: {
[Op.or]: filter.user.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.subjects,
as: 'subjects_taught',
required: false,
},
{
model: db.file,
as: 'photo',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.full_name) {
where = {
...where,
[Op.and]: Utils.ilike(
'staff_members',
'full_name',
filter.full_name,
),
};
}
if (filter.bio) {
where = {
...where,
[Op.and]: Utils.ilike(
'staff_members',
'bio',
filter.bio,
),
};
}
if (filter.phone_number) {
where = {
...where,
[Op.and]: Utils.ilike(
'staff_members',
'phone_number',
filter.phone_number,
),
};
}
if (filter.email) {
where = {
...where,
[Op.and]: Utils.ilike(
'staff_members',
'email',
filter.email,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.staff_role) {
where = {
...where,
staff_role: filter.staff_role,
};
}
if (filter.public_profile) {
where = {
...where,
public_profile: filter.public_profile,
};
}
if (filter.subjects_taught) {
const searchTerms = filter.subjects_taught.split('|');
include = [
{
model: db.subjects,
as: 'subjects_taught_filter',
required: searchTerms.length > 0,
where: searchTerms.length > 0 ? {
[Op.or]: [
{ id: { [Op.in]: searchTerms.map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: searchTerms.map(term => ({ [Op.iLike]: `%${term}%` }))
}
}
]
} : undefined
},
...include,
]
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.staff_members.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'staff_members',
'full_name',
query,
),
],
};
}
const records = await db.staff_members.findAll({
attributes: [ 'id', 'full_name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['full_name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.full_name,
}));
}
};
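The relation filters above accept `'|'`-separated search terms: each term is matched either as an exact id or as a case-insensitive substring (`Op.iLike`). A runnable sketch of just the term parsing, with `buildTermMatchers` as a hypothetical helper name (not part of the real API):

```javascript
// Sketch of the '|'-separated relation-filter parsing used in findAll above.
// Each term yields an exact-id candidate and an iLike-style '%term%' pattern.
function buildTermMatchers(filterValue) {
  const terms = filterValue.split('|').filter(Boolean);
  return terms.map((term) => ({
    id: term,               // candidate exact-id match
    pattern: `%${term}%`,   // candidate case-insensitive substring pattern
  }));
}
```

The real code feeds these into an `Op.or` of `{ id: { [Op.in]: ... } }` and `{ [Op.iLike]: pattern }` clauses on the joined model.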


@@ -0,0 +1,453 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class StreamsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const streams = await db.streams.create(
{
id: data.id || undefined,
name_om: data.name_om || null,
name_am: data.name_am || null,
name_en: data.name_en || null,
stream_type: data.stream_type || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
return streams;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare the row payloads, normalizing each item via map()
const streamsData = data.map((item, index) => ({
id: item.id || undefined,
name_om: item.name_om || null,
name_am: item.name_am || null,
name_en: item.name_en || null,
stream_type: item.stream_type || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const streams = await db.streams.bulkCreate(streamsData, { transaction });
return streams;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const streams = await db.streams.findByPk(id, { transaction });
const updatePayload = {};
if (data.name_om !== undefined) updatePayload.name_om = data.name_om;
if (data.name_am !== undefined) updatePayload.name_am = data.name_am;
if (data.name_en !== undefined) updatePayload.name_en = data.name_en;
if (data.stream_type !== undefined) updatePayload.stream_type = data.stream_type;
updatePayload.updatedById = currentUser.id;
await streams.update(updatePayload, {transaction});
return streams;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const streams = await db.streams.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of streams) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of streams) {
await record.destroy({transaction});
}
});
return streams;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const streams = await db.streams.findByPk(id, options);
await streams.update({
deletedBy: currentUser.id
}, {
transaction,
});
await streams.destroy({
transaction
});
return streams;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const streams = await db.streams.findOne({ where, transaction });
if (!streams) {
return streams;
}
const output = streams.get({plain: true});
output.subject_offerings_stream = await streams.getSubject_offerings_stream({
transaction
});
output.students_current_stream = await streams.getStudents_current_stream({
transaction
});
output.class_sections_stream = await streams.getClass_sections_stream({
transaction
});
output.study_materials_stream = await streams.getStudy_materials_stream({
transaction
});
output.exams_stream = await streams.getExams_stream({
transaction
});
output.admission_applications_requested_stream = await streams.getAdmission_applications_requested_stream({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'streams',
'name_om',
filter.name_om,
),
};
}
if (filter.name_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'streams',
'name_am',
filter.name_am,
),
};
}
if (filter.name_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'streams',
'name_en',
filter.name_en,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.stream_type) {
where = {
...where,
stream_type: filter.stream_type,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.streams.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'streams',
'name_en',
query,
),
],
};
}
const records = await db.streams.findAll({
attributes: [ 'id', 'name_en' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name_en', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name_en,
}));
}
};
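Pagination in every `findAll` above is plain limit/offset: the offset is the zero-based page index times the page size. A minimal sketch of that computation (`pageToOffset` is a hypothetical helper name, mirroring the `currentPage * limit` arithmetic in the real code):

```javascript
// Sketch of the limit/offset pagination used by findAll above.
// A missing or non-numeric page/limit degrades to 0, matching the
// `filter.limit || 0` and `+filter.page` behavior in the real code.
function pageToOffset(page, limit) {
  const currentPage = Number(page) || 0;
  const pageSize = Number(limit) || 0;
  return currentPage * pageSize;
}
```

With `limit: 25`, page 2 starts at row 50; when `limit` is 0 the query options leave `limit`/`offset` undefined and all rows are returned.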


@@ -0,0 +1,667 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class StudentsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const students = await db.students.create(
{
id: data.id || undefined,
full_name: data.full_name || null,
student_code: data.student_code || null,
gender: data.gender || null,
date_of_birth: data.date_of_birth || null,
guardian_name: data.guardian_name || null,
guardian_phone: data.guardian_phone || null,
address_text: data.address_text || null,
active: data.active || false,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await students.setUser( data.user || null, {
transaction,
});
await students.setCurrent_grade( data.current_grade || null, {
transaction,
});
await students.setCurrent_stream( data.current_stream || null, {
transaction,
});
return students;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare the row payloads, normalizing each item via map()
const studentsData = data.map((item, index) => ({
id: item.id || undefined,
full_name: item.full_name || null,
student_code: item.student_code || null,
gender: item.gender || null,
date_of_birth: item.date_of_birth || null,
guardian_name: item.guardian_name || null,
guardian_phone: item.guardian_phone || null,
address_text: item.address_text || null,
active: item.active || false,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const students = await db.students.bulkCreate(studentsData, { transaction });
return students;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const students = await db.students.findByPk(id, { transaction });
const updatePayload = {};
if (data.full_name !== undefined) updatePayload.full_name = data.full_name;
if (data.student_code !== undefined) updatePayload.student_code = data.student_code;
if (data.gender !== undefined) updatePayload.gender = data.gender;
if (data.date_of_birth !== undefined) updatePayload.date_of_birth = data.date_of_birth;
if (data.guardian_name !== undefined) updatePayload.guardian_name = data.guardian_name;
if (data.guardian_phone !== undefined) updatePayload.guardian_phone = data.guardian_phone;
if (data.address_text !== undefined) updatePayload.address_text = data.address_text;
if (data.active !== undefined) updatePayload.active = data.active;
updatePayload.updatedById = currentUser.id;
await students.update(updatePayload, {transaction});
if (data.user !== undefined) {
await students.setUser(
data.user,
{ transaction }
);
}
if (data.current_grade !== undefined) {
await students.setCurrent_grade(
data.current_grade,
{ transaction }
);
}
if (data.current_stream !== undefined) {
await students.setCurrent_stream(
data.current_stream,
{ transaction }
);
}
return students;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const students = await db.students.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of students) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of students) {
await record.destroy({transaction});
}
});
return students;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const students = await db.students.findByPk(id, options);
await students.update({
deletedBy: currentUser.id
}, {
transaction,
});
await students.destroy({
transaction
});
return students;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const students = await db.students.findOne({ where, transaction });
if (!students) {
return students;
}
const output = students.get({plain: true});
output.enrollments_student = await students.getEnrollments_student({
transaction
});
output.exam_results_student = await students.getExam_results_student({
transaction
});
output.attendance_records_student = await students.getAttendance_records_student({
transaction
});
output.assignment_submissions_student = await students.getAssignment_submissions_student({
transaction
});
output.top_student_features_student = await students.getTop_student_features_student({
transaction
});
output.user = await students.getUser({
transaction
});
output.current_grade = await students.getCurrent_grade({
transaction
});
output.current_stream = await students.getCurrent_stream({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
{
model: db.users,
as: 'user',
where: filter.user ? {
[Op.or]: [
{ id: { [Op.in]: filter.user.split('|').map(term => Utils.uuid(term)) } },
{
firstName: {
[Op.or]: filter.user.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.grades,
as: 'current_grade',
where: filter.current_grade ? {
[Op.or]: [
{ id: { [Op.in]: filter.current_grade.split('|').map(term => Utils.uuid(term)) } },
{
label: {
[Op.or]: filter.current_grade.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.streams,
as: 'current_stream',
where: filter.current_stream ? {
[Op.or]: [
{ id: { [Op.in]: filter.current_stream.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.current_stream.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.full_name) {
where = {
...where,
[Op.and]: Utils.ilike(
'students',
'full_name',
filter.full_name,
),
};
}
if (filter.student_code) {
where = {
...where,
[Op.and]: Utils.ilike(
'students',
'student_code',
filter.student_code,
),
};
}
if (filter.guardian_name) {
where = {
...where,
[Op.and]: Utils.ilike(
'students',
'guardian_name',
filter.guardian_name,
),
};
}
if (filter.guardian_phone) {
where = {
...where,
[Op.and]: Utils.ilike(
'students',
'guardian_phone',
filter.guardian_phone,
),
};
}
if (filter.address_text) {
where = {
...where,
[Op.and]: Utils.ilike(
'students',
'address_text',
filter.address_text,
),
};
}
if (filter.date_of_birthRange) {
const [start, end] = filter.date_of_birthRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
date_of_birth: {
...where.date_of_birth,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
date_of_birth: {
...where.date_of_birth,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.gender) {
where = {
...where,
gender: filter.gender,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.students.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'students',
'full_name',
query,
),
],
};
}
const records = await db.students.findAll({
attributes: [ 'id', 'full_name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['full_name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.full_name,
}));
}
};
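Each `bulkImport` above staggers `createdAt` by one second per row (`new Date(Date.now() + index * 1000)`), so that the default `ORDER BY createdAt DESC` keeps imported rows in a stable order. A runnable sketch of just that staggering (`staggerTimestamps` is a hypothetical helper name):

```javascript
// Sketch of the createdAt staggering used by bulkImport above: each
// successive row gets a timestamp one second after the previous one,
// so later createdAt-based sorting preserves import order.
function staggerTimestamps(rows, baseMs) {
  return rows.map((row, index) => ({
    ...row,
    createdAt: new Date(baseMs + index * 1000),
  }));
}
```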


@@ -0,0 +1,724 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Study_materialsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const study_materials = await db.study_materials.create(
{
id: data.id || undefined,
title_om: data.title_om || null,
title_am: data.title_am || null,
title_en: data.title_en || null,
material_type: data.material_type || null,
description: data.description || null,
external_url: data.external_url || null,
visibility: data.visibility || null,
published_at: data.published_at || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await study_materials.setGrade( data.grade || null, {
transaction,
});
await study_materials.setSubject( data.subject || null, {
transaction,
});
await study_materials.setStream( data.stream || null, {
transaction,
});
await study_materials.setUploaded_by( data.uploaded_by || null, {
transaction,
});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.study_materials.getTableName(),
belongsToColumn: 'file',
belongsToId: study_materials.id,
},
data.file,
options,
);
return study_materials;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare the row payloads, normalizing each item via map()
const study_materialsData = data.map((item, index) => ({
id: item.id || undefined,
title_om: item.title_om || null,
title_am: item.title_am || null,
title_en: item.title_en || null,
material_type: item.material_type || null,
description: item.description || null,
external_url: item.external_url || null,
visibility: item.visibility || null,
published_at: item.published_at || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const study_materials = await db.study_materials.bulkCreate(study_materialsData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < study_materials.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.study_materials.getTableName(),
belongsToColumn: 'file',
belongsToId: study_materials[i].id,
},
data[i].file,
options,
);
}
return study_materials;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const study_materials = await db.study_materials.findByPk(id, { transaction });
const updatePayload = {};
if (data.title_om !== undefined) updatePayload.title_om = data.title_om;
if (data.title_am !== undefined) updatePayload.title_am = data.title_am;
if (data.title_en !== undefined) updatePayload.title_en = data.title_en;
if (data.material_type !== undefined) updatePayload.material_type = data.material_type;
if (data.description !== undefined) updatePayload.description = data.description;
if (data.external_url !== undefined) updatePayload.external_url = data.external_url;
if (data.visibility !== undefined) updatePayload.visibility = data.visibility;
if (data.published_at !== undefined) updatePayload.published_at = data.published_at;
updatePayload.updatedById = currentUser.id;
await study_materials.update(updatePayload, {transaction});
if (data.grade !== undefined) {
await study_materials.setGrade(
data.grade,
{ transaction }
);
}
if (data.subject !== undefined) {
await study_materials.setSubject(
data.subject,
{ transaction }
);
}
if (data.stream !== undefined) {
await study_materials.setStream(
data.stream,
{ transaction }
);
}
if (data.uploaded_by !== undefined) {
await study_materials.setUploaded_by(
data.uploaded_by,
{ transaction }
);
}
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.study_materials.getTableName(),
belongsToColumn: 'file',
belongsToId: study_materials.id,
},
data.file,
options,
);
return study_materials;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const study_materials = await db.study_materials.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of study_materials) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of study_materials) {
await record.destroy({transaction});
}
});
return study_materials;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const study_materials = await db.study_materials.findByPk(id, options);
await study_materials.update({
deletedBy: currentUser.id
}, {
transaction,
});
await study_materials.destroy({
transaction
});
return study_materials;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const study_materials = await db.study_materials.findOne({ where, transaction });
if (!study_materials) {
return study_materials;
}
const output = study_materials.get({plain: true});
output.grade = await study_materials.getGrade({
transaction
});
output.subject = await study_materials.getSubject({
transaction
});
output.stream = await study_materials.getStream({
transaction
});
output.file = await study_materials.getFile({
transaction
});
output.uploaded_by = await study_materials.getUploaded_by({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
let include = [
{
model: db.grades,
as: 'grade',
where: filter.grade ? {
[Op.or]: [
{ id: { [Op.in]: filter.grade.split('|').map(term => Utils.uuid(term)) } },
{
label: {
[Op.or]: filter.grade.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.subjects,
as: 'subject',
where: filter.subject ? {
[Op.or]: [
{ id: { [Op.in]: filter.subject.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.subject.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.streams,
as: 'stream',
where: filter.stream ? {
[Op.or]: [
{ id: { [Op.in]: filter.stream.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.stream.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.staff_members,
as: 'uploaded_by',
where: filter.uploaded_by ? {
[Op.or]: [
{ id: { [Op.in]: filter.uploaded_by.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.uploaded_by.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.file,
as: 'file',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.title_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'study_materials',
'title_om',
filter.title_om,
),
};
}
if (filter.title_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'study_materials',
'title_am',
filter.title_am,
),
};
}
if (filter.title_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'study_materials',
'title_en',
filter.title_en,
),
};
}
if (filter.description) {
where = {
...where,
[Op.and]: Utils.ilike(
'study_materials',
'description',
filter.description,
),
};
}
if (filter.external_url) {
where = {
...where,
[Op.and]: Utils.ilike(
'study_materials',
'external_url',
filter.external_url,
),
};
}
if (filter.published_atRange) {
const [start, end] = filter.published_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
published_at: {
...where.published_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
published_at: {
...where.published_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.material_type) {
where = {
...where,
material_type: filter.material_type,
};
}
if (filter.visibility) {
where = {
...where,
visibility: filter.visibility,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.study_materials.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'study_materials',
'title_en',
query,
),
],
};
}
const records = await db.study_materials.findAll({
attributes: [ 'id', 'title_en' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['title_en', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.title_en,
}));
}
};


@ -0,0 +1,581 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Subject_offeringsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const subject_offerings = await db.subject_offerings.create(
{
id: data.id || undefined,
learning_objectives_om: data.learning_objectives_om || null,
learning_objectives_am: data.learning_objectives_am || null,
learning_objectives_en: data.learning_objectives_en || null,
topics_outline: data.topics_outline || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await subject_offerings.setSubject( data.subject || null, {
transaction,
});
await subject_offerings.setGrade( data.grade || null, {
transaction,
});
await subject_offerings.setStream( data.stream || null, {
transaction,
});
await subject_offerings.setResponsible_teacher( data.responsible_teacher || null, {
transaction,
});
return subject_offerings;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data: map each item to its bulk-create payload
const subject_offeringsData = data.map((item, index) => ({
id: item.id || undefined,
learning_objectives_om: item.learning_objectives_om || null,
learning_objectives_am: item.learning_objectives_am || null,
learning_objectives_en: item.learning_objectives_en || null,
topics_outline: item.topics_outline || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const subject_offerings = await db.subject_offerings.bulkCreate(subject_offeringsData, { transaction });
return subject_offerings;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const subject_offerings = await db.subject_offerings.findByPk(id, { transaction });
const updatePayload = {};
if (data.learning_objectives_om !== undefined) updatePayload.learning_objectives_om = data.learning_objectives_om;
if (data.learning_objectives_am !== undefined) updatePayload.learning_objectives_am = data.learning_objectives_am;
if (data.learning_objectives_en !== undefined) updatePayload.learning_objectives_en = data.learning_objectives_en;
if (data.topics_outline !== undefined) updatePayload.topics_outline = data.topics_outline;
updatePayload.updatedById = currentUser.id;
await subject_offerings.update(updatePayload, {transaction});
if (data.subject !== undefined) {
await subject_offerings.setSubject(
data.subject,
{ transaction }
);
}
if (data.grade !== undefined) {
await subject_offerings.setGrade(
data.grade,
{ transaction }
);
}
if (data.stream !== undefined) {
await subject_offerings.setStream(
data.stream,
{ transaction }
);
}
if (data.responsible_teacher !== undefined) {
await subject_offerings.setResponsible_teacher(
data.responsible_teacher,
{ transaction }
);
}
return subject_offerings;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const subject_offerings = await db.subject_offerings.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of subject_offerings) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of subject_offerings) {
await record.destroy({transaction});
}
});
return subject_offerings;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const subject_offerings = await db.subject_offerings.findByPk(id, options);
await subject_offerings.update({
deletedBy: currentUser.id
}, {
transaction,
});
await subject_offerings.destroy({
transaction
});
return subject_offerings;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const subject_offerings = await db.subject_offerings.findOne({ where, transaction });
if (!subject_offerings) {
return subject_offerings;
}
const output = subject_offerings.get({plain: true});
output.subject = await subject_offerings.getSubject({
transaction
});
output.grade = await subject_offerings.getGrade({
transaction
});
output.stream = await subject_offerings.getStream({
transaction
});
output.responsible_teacher = await subject_offerings.getResponsible_teacher({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.subjects,
as: 'subject',
where: filter.subject ? {
[Op.or]: [
{ id: { [Op.in]: filter.subject.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.subject.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.grades,
as: 'grade',
where: filter.grade ? {
[Op.or]: [
{ id: { [Op.in]: filter.grade.split('|').map(term => Utils.uuid(term)) } },
{
label: {
[Op.or]: filter.grade.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.streams,
as: 'stream',
where: filter.stream ? {
[Op.or]: [
{ id: { [Op.in]: filter.stream.split('|').map(term => Utils.uuid(term)) } },
{
name_en: {
[Op.or]: filter.stream.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.staff_members,
as: 'responsible_teacher',
where: filter.responsible_teacher ? {
[Op.or]: [
{ id: { [Op.in]: filter.responsible_teacher.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.responsible_teacher.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.learning_objectives_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'subject_offerings',
'learning_objectives_om',
filter.learning_objectives_om,
),
};
}
if (filter.learning_objectives_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'subject_offerings',
'learning_objectives_am',
filter.learning_objectives_am,
),
};
}
if (filter.learning_objectives_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'subject_offerings',
'learning_objectives_en',
filter.learning_objectives_en,
),
};
}
if (filter.topics_outline) {
where = {
...where,
[Op.and]: Utils.ilike(
'subject_offerings',
'topics_outline',
filter.topics_outline,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.subject_offerings.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'subject_offerings',
'topics_outline',
query,
),
],
};
}
const records = await db.subject_offerings.findAll({
attributes: [ 'id', 'topics_outline' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['topics_outline', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.topics_outline,
}));
}
};


@ -0,0 +1,555 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class SubjectsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const subjects = await db.subjects.create(
{
id: data.id || undefined,
name_om: data.name_om || null,
name_am: data.name_am || null,
name_en: data.name_en || null,
overview_om: data.overview_om || null,
overview_am: data.overview_am || null,
overview_en: data.overview_en || null,
textbook_references: data.textbook_references || null,
active: data.active || false,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
return subjects;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data: map each item to its bulk-create payload
const subjectsData = data.map((item, index) => ({
id: item.id || undefined,
name_om: item.name_om || null,
name_am: item.name_am || null,
name_en: item.name_en || null,
overview_om: item.overview_om || null,
overview_am: item.overview_am || null,
overview_en: item.overview_en || null,
textbook_references: item.textbook_references || null,
active: item.active || false,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const subjects = await db.subjects.bulkCreate(subjectsData, { transaction });
return subjects;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const subjects = await db.subjects.findByPk(id, { transaction });
const updatePayload = {};
if (data.name_om !== undefined) updatePayload.name_om = data.name_om;
if (data.name_am !== undefined) updatePayload.name_am = data.name_am;
if (data.name_en !== undefined) updatePayload.name_en = data.name_en;
if (data.overview_om !== undefined) updatePayload.overview_om = data.overview_om;
if (data.overview_am !== undefined) updatePayload.overview_am = data.overview_am;
if (data.overview_en !== undefined) updatePayload.overview_en = data.overview_en;
if (data.textbook_references !== undefined) updatePayload.textbook_references = data.textbook_references;
if (data.active !== undefined) updatePayload.active = data.active;
updatePayload.updatedById = currentUser.id;
await subjects.update(updatePayload, {transaction});
return subjects;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const subjects = await db.subjects.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of subjects) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of subjects) {
await record.destroy({transaction});
}
});
return subjects;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const subjects = await db.subjects.findByPk(id, options);
await subjects.update({
deletedBy: currentUser.id
}, {
transaction,
});
await subjects.destroy({
transaction
});
return subjects;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const subjects = await db.subjects.findOne({ where, transaction });
if (!subjects) {
return subjects;
}
const output = subjects.get({plain: true});
output.subject_offerings_subject = await subjects.getSubject_offerings_subject({
transaction
});
output.study_materials_subject = await subjects.getStudy_materials_subject({
transaction
});
output.exam_results_subject = await subjects.getExam_results_subject({
transaction
});
output.attendance_sessions_subject = await subjects.getAttendance_sessions_subject({
transaction
});
output.class_schedules_subject = await subjects.getClass_schedules_subject({
transaction
});
output.assignments_subject = await subjects.getAssignments_subject({
transaction
});
output.lesson_plans_subject = await subjects.getLesson_plans_subject({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.name_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'subjects',
'name_om',
filter.name_om,
),
};
}
if (filter.name_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'subjects',
'name_am',
filter.name_am,
),
};
}
if (filter.name_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'subjects',
'name_en',
filter.name_en,
),
};
}
if (filter.overview_om) {
where = {
...where,
[Op.and]: Utils.ilike(
'subjects',
'overview_om',
filter.overview_om,
),
};
}
if (filter.overview_am) {
where = {
...where,
[Op.and]: Utils.ilike(
'subjects',
'overview_am',
filter.overview_am,
),
};
}
if (filter.overview_en) {
where = {
...where,
[Op.and]: Utils.ilike(
'subjects',
'overview_en',
filter.overview_en,
),
};
}
if (filter.textbook_references) {
where = {
...where,
[Op.and]: Utils.ilike(
'subjects',
'textbook_references',
filter.textbook_references,
),
};
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.subjects.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'subjects',
'name_en',
query,
),
],
};
}
const records = await db.subjects.findAll({
attributes: [ 'id', 'name_en' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['name_en', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.name_en,
}));
}
};

490
backend/src/db/api/terms.js Normal file

@ -0,0 +1,490 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class TermsDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const terms = await db.terms.create(
{
id: data.id || undefined,
term_name: data.term_name || null,
starts_at: data.starts_at || null,
ends_at: data.ends_at || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await terms.setSchool_year( data.school_year || null, {
transaction,
});
return terms;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data: map each item to its bulk-create payload
const termsData = data.map((item, index) => ({
id: item.id || undefined,
term_name: item.term_name || null,
starts_at: item.starts_at || null,
ends_at: item.ends_at || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const terms = await db.terms.bulkCreate(termsData, { transaction });
return terms;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const terms = await db.terms.findByPk(id, { transaction });
const updatePayload = {};
if (data.term_name !== undefined) updatePayload.term_name = data.term_name;
if (data.starts_at !== undefined) updatePayload.starts_at = data.starts_at;
if (data.ends_at !== undefined) updatePayload.ends_at = data.ends_at;
updatePayload.updatedById = currentUser.id;
await terms.update(updatePayload, {transaction});
if (data.school_year !== undefined) {
await terms.setSchool_year(
data.school_year,
{ transaction }
);
}
return terms;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const terms = await db.terms.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of terms) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of terms) {
await record.destroy({transaction});
}
});
return terms;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const terms = await db.terms.findByPk(id, options);
await terms.update({
deletedBy: currentUser.id
}, {
transaction,
});
await terms.destroy({
transaction
});
return terms;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const terms = await db.terms.findOne({ where, transaction });
if (!terms) {
return terms;
}
const output = terms.get({plain: true});
output.exams_term = await terms.getExams_term({
transaction
});
output.school_year = await terms.getSchool_year({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.school_years,
as: 'school_year',
where: filter.school_year ? {
[Op.or]: [
{ id: { [Op.in]: filter.school_year.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.school_year.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.calendarStart && filter.calendarEnd) {
where = {
...where,
[Op.or]: [
{
starts_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
{
ends_at: {
[Op.between]: [filter.calendarStart, filter.calendarEnd],
},
},
],
};
}
if (filter.starts_atRange) {
const [start, end] = filter.starts_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
starts_at: {
...where.starts_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
starts_at: {
...where.starts_at,
[Op.lte]: end,
},
};
}
}
if (filter.ends_atRange) {
const [start, end] = filter.ends_atRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
ends_at: {
...where.ends_at,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
ends_at: {
...where.ends_at,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.term_name) {
where = {
...where,
term_name: filter.term_name,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.terms.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'terms',
'term_name',
query,
),
],
};
}
const records = await db.terms.findAll({
attributes: [ 'id', 'term_name' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['term_name', 'ASC']],
});
return records.map((record) => ({
id: record.id,
label: record.term_name,
}));
}
};


@ -0,0 +1,553 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class Top_student_featuresDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const top_student_features = await db.top_student_features.create(
{
id: data.id || undefined,
total_score: data.total_score || null,
rank: data.rank || null,
consent_to_publish: data.consent_to_publish || false,
published: data.published || false,
note: data.note || null,
importHash: data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
await top_student_features.setExam( data.exam || null, {
transaction,
});
await top_student_features.setStudent( data.student || null, {
transaction,
});
return top_student_features;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare data: map each item to its bulk-create payload
const top_student_featuresData = data.map((item, index) => ({
id: item.id || undefined,
total_score: item.total_score || null,
rank: item.rank || null,
consent_to_publish: item.consent_to_publish || false,
published: item.published || false,
note: item.note || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000),
}));
// Bulk create items
const top_student_features = await db.top_student_features.bulkCreate(top_student_featuresData, { transaction });
return top_student_features;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const top_student_features = await db.top_student_features.findByPk(id, { transaction });
const updatePayload = {};
if (data.total_score !== undefined) updatePayload.total_score = data.total_score;
if (data.rank !== undefined) updatePayload.rank = data.rank;
if (data.consent_to_publish !== undefined) updatePayload.consent_to_publish = data.consent_to_publish;
if (data.published !== undefined) updatePayload.published = data.published;
if (data.note !== undefined) updatePayload.note = data.note;
updatePayload.updatedById = currentUser.id;
await top_student_features.update(updatePayload, {transaction});
if (data.exam !== undefined) {
await top_student_features.setExam(
data.exam,
{ transaction }
);
}
if (data.student !== undefined) {
await top_student_features.setStudent(
data.student,
{ transaction }
);
}
return top_student_features;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const top_student_features = await db.top_student_features.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of top_student_features) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of top_student_features) {
await record.destroy({transaction});
}
});
return top_student_features;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const top_student_features = await db.top_student_features.findByPk(id, options);
await top_student_features.update({
deletedBy: currentUser.id
}, {
transaction,
});
await top_student_features.destroy({
transaction
});
return top_student_features;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
const top_student_features = await db.top_student_features.findOne({ where, transaction });
if (!top_student_features) {
return top_student_features;
}
const output = top_student_features.get({plain: true});
output.exam = await top_student_features.getExam({
transaction
});
output.student = await top_student_features.getStudent({
transaction
});
return output;
}
static async findAll(
filter,
options
) {
const limit = filter.limit || 0;
let offset = 0;
let where = {};
const currentPage = +filter.page;
offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.exams,
as: 'exam',
where: filter.exam ? {
[Op.or]: [
{ id: { [Op.in]: filter.exam.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.exam.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.students,
as: 'student',
where: filter.student ? {
[Op.or]: [
{ id: { [Op.in]: filter.student.split('|').map(term => Utils.uuid(term)) } },
{
full_name: {
[Op.or]: filter.student.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.note) {
where = {
...where,
[Op.and]: Utils.ilike(
'top_student_features',
'note',
filter.note,
),
};
}
if (filter.total_scoreRange) {
const [start, end] = filter.total_scoreRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
total_score: {
...where.total_score,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
total_score: {
...where.total_score,
[Op.lte]: end,
},
};
}
}
if (filter.rankRange) {
const [start, end] = filter.rankRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
rank: {
...where.rank,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
rank: {
...where.rank,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.consent_to_publish) {
where = {
...where,
consent_to_publish: filter.consent_to_publish,
};
}
if (filter.published) {
where = {
...where,
published: filter.published,
};
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.top_student_features.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'top_student_features',
'note',
query,
),
],
};
}
const records = await db.top_student_features.findAll({
attributes: [ 'id', 'note' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['note', 'ASC']], // Sequelize's option is 'order'; 'orderBy' is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.note,
}));
}
};
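The range filters in `findAll` above (`total_scoreRange`, `rankRange`, `createdAtRange`) all repeat the same merge pattern. A minimal standalone sketch of that pattern, with a local stand-in for Sequelize's `Op` symbols (illustration only, no real Sequelize here):

```javascript
// Stand-in for Sequelize's operator symbols (assumption: illustration only).
const Op = { gte: Symbol('gte'), lte: Symbol('lte') };

// Mirrors the findAll range-filter logic: each bound is applied only when
// present (not undefined/null/''), and both bounds merge into one clause.
function applyRange(where, column, range) {
  const [start, end] = range || [];
  if (start !== undefined && start !== null && start !== '') {
    where = { ...where, [column]: { ...where[column], [Op.gte]: start } };
  }
  if (end !== undefined && end !== null && end !== '') {
    where = { ...where, [column]: { ...where[column], [Op.lte]: end } };
  }
  return where;
}

const where = applyRange({}, 'total_score', [50, 90]);
// where.total_score now carries both the lower and upper bound
```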

backend/src/db/api/users.js Normal file

@ -0,0 +1,975 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');
const bcrypt = require('bcrypt');
const config = require('../../config');
const Sequelize = db.Sequelize;
const Op = Sequelize.Op;
module.exports = class UsersDBApi {
static async create(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const users = await db.users.create(
{
id: data.data.id || undefined,
firstName: data.data.firstName || null,
lastName: data.data.lastName || null,
phoneNumber: data.data.phoneNumber || null,
email: data.data.email || null,
disabled: data.data.disabled ?? false,
password: data.data.password || null,
// '??' keeps an explicit false; the original '|| true' forced this to always be true
emailVerified: data.data.emailVerified ?? true,
emailVerificationToken: data.data.emailVerificationToken || null,
emailVerificationTokenExpiresAt: data.data.emailVerificationTokenExpiresAt || null,
passwordResetToken: data.data.passwordResetToken || null,
passwordResetTokenExpiresAt: data.data.passwordResetTokenExpiresAt || null,
provider: data.data.provider || null,
importHash: data.data.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
},
{ transaction },
);
if (!data.data.app_role) {
const role = await db.roles.findOne({
where: { name: 'User' },
});
if (role) {
await users.setApp_role(role, {
transaction,
});
}
} else {
await users.setApp_role(data.data.app_role || null, {
transaction,
});
}
await users.setCustom_permissions(data.data.custom_permissions || [], {
transaction,
});
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.users.getTableName(),
belongsToColumn: 'avatar',
belongsToId: users.id,
},
data.data.avatar,
options,
);
return users;
}
static async bulkImport(data, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
// Prepare rows, transforming each item with map()
const usersData = data.map((item, index) => ({
id: item.id || undefined,
firstName: item.firstName || null,
lastName: item.lastName || null,
phoneNumber: item.phoneNumber || null,
email: item.email || null,
disabled: item.disabled ?? false,
password: item.password || null,
emailVerified: item.emailVerified ?? false,
emailVerificationToken: item.emailVerificationToken || null,
emailVerificationTokenExpiresAt: item.emailVerificationTokenExpiresAt || null,
passwordResetToken: item.passwordResetToken || null,
passwordResetTokenExpiresAt: item.passwordResetTokenExpiresAt || null,
provider: item.provider || null,
importHash: item.importHash || null,
createdById: currentUser.id,
updatedById: currentUser.id,
createdAt: new Date(Date.now() + index * 1000), // stagger timestamps to preserve import order
}));
// Bulk create items
const users = await db.users.bulkCreate(usersData, { transaction });
// For each item created, replace relation files
for (let i = 0; i < users.length; i++) {
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.users.getTableName(),
belongsToColumn: 'avatar',
belongsToId: users[i].id,
},
data[i].avatar,
options,
);
}
return users;
}
static async update(id, data, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const users = await db.users.findByPk(id, { transaction }); // findByPk takes (id, options); a third argument is ignored
if (!data?.app_role) {
data.app_role = users?.app_role?.id;
}
if (!data?.custom_permissions) {
data.custom_permissions = users?.custom_permissions?.map(item => item.id);
}
if (data.password) {
data.password = bcrypt.hashSync(
data.password,
config.bcrypt.saltRounds,
);
} else {
data.password = users.password;
}
const updatePayload = {};
if (data.firstName !== undefined) updatePayload.firstName = data.firstName;
if (data.lastName !== undefined) updatePayload.lastName = data.lastName;
if (data.phoneNumber !== undefined) updatePayload.phoneNumber = data.phoneNumber;
if (data.email !== undefined) updatePayload.email = data.email;
if (data.disabled !== undefined) updatePayload.disabled = data.disabled;
if (data.password !== undefined) updatePayload.password = data.password;
if (data.emailVerified !== undefined) updatePayload.emailVerified = data.emailVerified;
else updatePayload.emailVerified = true;
if (data.emailVerificationToken !== undefined) updatePayload.emailVerificationToken = data.emailVerificationToken;
if (data.emailVerificationTokenExpiresAt !== undefined) updatePayload.emailVerificationTokenExpiresAt = data.emailVerificationTokenExpiresAt;
if (data.passwordResetToken !== undefined) updatePayload.passwordResetToken = data.passwordResetToken;
if (data.passwordResetTokenExpiresAt !== undefined) updatePayload.passwordResetTokenExpiresAt = data.passwordResetTokenExpiresAt;
if (data.provider !== undefined) updatePayload.provider = data.provider;
updatePayload.updatedById = currentUser.id;
await users.update(updatePayload, {transaction});
if (data.app_role !== undefined) {
await users.setApp_role(
data.app_role,
{ transaction }
);
}
if (data.custom_permissions !== undefined) {
await users.setCustom_permissions(data.custom_permissions, { transaction });
}
await FileDBApi.replaceRelationFiles(
{
belongsTo: db.users.getTableName(),
belongsToColumn: 'avatar',
belongsToId: users.id,
},
data.avatar,
options,
);
return users;
}
static async deleteByIds(ids, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const users = await db.users.findAll({
where: {
id: {
[Op.in]: ids,
},
},
transaction,
});
await db.sequelize.transaction(async (transaction) => {
for (const record of users) {
await record.update(
{deletedBy: currentUser.id},
{transaction}
);
}
for (const record of users) {
await record.destroy({transaction});
}
});
return users;
}
static async remove(id, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
const users = await db.users.findByPk(id, { transaction });
if (!users) {
throw new Error('users record not found');
}
await users.update({
deletedBy: currentUser.id
}, {
transaction,
});
await users.destroy({
transaction
});
return users;
}
static async findBy(where, options) {
const transaction = (options && options.transaction) || undefined;
// findOne takes a single options object; pass the transaction with the where clause
const users = await db.users.findOne({ where, transaction });
if (!users) {
return users;
}
const output = users.get({plain: true});
output.staff_members_user = await users.getStaff_members_user({
transaction
});
output.students_user = await users.getStudents_user({
transaction
});
output.messages_sender = await users.getMessages_sender({
transaction
});
output.messages_recipient = await users.getMessages_recipient({
transaction
});
output.avatar = await users.getAvatar({
transaction
});
output.app_role = await users.getApp_role({
transaction
});
if (output.app_role) {
output.app_role_permissions = await output.app_role.getPermissions({
transaction,
});
}
output.custom_permissions = await users.getCustom_permissions({
transaction
});
return output;
}
static async findAll(filter, options) {
const limit = filter.limit || 0;
let where = {};
const currentPage = +filter.page || 0; // guard against NaN when page is missing
const offset = currentPage * limit;
const transaction = (options && options.transaction) || undefined;
let include = [
{
model: db.roles,
as: 'app_role',
where: filter.app_role ? {
[Op.or]: [
{ id: { [Op.in]: filter.app_role.split('|').map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: filter.app_role.split('|').map(term => ({ [Op.iLike]: `%${term}%` }))
}
},
]
} : {},
},
{
model: db.permissions,
as: 'custom_permissions',
required: false,
},
{
model: db.file,
as: 'avatar',
},
];
if (filter) {
if (filter.id) {
where = {
...where,
['id']: Utils.uuid(filter.id),
};
}
if (filter.firstName) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'firstName',
filter.firstName,
),
};
}
if (filter.lastName) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'lastName',
filter.lastName,
),
};
}
if (filter.phoneNumber) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'phoneNumber',
filter.phoneNumber,
),
};
}
if (filter.email) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'email',
filter.email,
),
};
}
if (filter.password) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'password',
filter.password,
),
};
}
if (filter.emailVerificationToken) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'emailVerificationToken',
filter.emailVerificationToken,
),
};
}
if (filter.passwordResetToken) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'passwordResetToken',
filter.passwordResetToken,
),
};
}
if (filter.provider) {
where = {
...where,
[Op.and]: Utils.ilike(
'users',
'provider',
filter.provider,
),
};
}
if (filter.emailVerificationTokenExpiresAtRange) {
const [start, end] = filter.emailVerificationTokenExpiresAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
emailVerificationTokenExpiresAt: {
...where.emailVerificationTokenExpiresAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
emailVerificationTokenExpiresAt: {
...where.emailVerificationTokenExpiresAt,
[Op.lte]: end,
},
};
}
}
if (filter.passwordResetTokenExpiresAtRange) {
const [start, end] = filter.passwordResetTokenExpiresAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
passwordResetTokenExpiresAt: {
...where.passwordResetTokenExpiresAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
passwordResetTokenExpiresAt: {
...where.passwordResetTokenExpiresAt,
[Op.lte]: end,
},
};
}
}
if (filter.active !== undefined) {
where = {
...where,
active: filter.active === true || filter.active === 'true'
};
}
if (filter.disabled) {
where = {
...where,
disabled: filter.disabled,
};
}
if (filter.emailVerified) {
where = {
...where,
emailVerified: filter.emailVerified,
};
}
if (filter.custom_permissions) {
const searchTerms = filter.custom_permissions.split('|');
include = [
{
model: db.permissions,
as: 'custom_permissions_filter',
required: searchTerms.length > 0,
where: searchTerms.length > 0 ? {
[Op.or]: [
{ id: { [Op.in]: searchTerms.map(term => Utils.uuid(term)) } },
{
name: {
[Op.or]: searchTerms.map(term => ({ [Op.iLike]: `%${term}%` }))
}
}
]
} : undefined
},
...include,
]
}
if (filter.createdAtRange) {
const [start, end] = filter.createdAtRange;
if (start !== undefined && start !== null && start !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.gte]: start,
},
};
}
if (end !== undefined && end !== null && end !== '') {
where = {
...where,
['createdAt']: {
...where.createdAt,
[Op.lte]: end,
},
};
}
}
}
const queryOptions = {
where,
include,
distinct: true,
order: filter.field && filter.sort
? [[filter.field, filter.sort]]
: [['createdAt', 'desc']],
transaction: options?.transaction,
logging: console.log
};
if (!options?.countOnly) {
queryOptions.limit = limit ? Number(limit) : undefined;
queryOptions.offset = offset ? Number(offset) : undefined;
}
try {
const { rows, count } = await db.users.findAndCountAll(queryOptions);
return {
rows: options?.countOnly ? [] : rows,
count: count
};
} catch (error) {
console.error('Error executing query:', error);
throw error;
}
}
static async findAllAutocomplete(query, limit, offset) {
let where = {};
if (query) {
where = {
[Op.or]: [
{ ['id']: Utils.uuid(query) },
Utils.ilike(
'users',
'firstName',
query,
),
],
};
}
const records = await db.users.findAll({
attributes: [ 'id', 'firstName' ],
where,
limit: limit ? Number(limit) : undefined,
offset: offset ? Number(offset) : undefined,
order: [['firstName', 'ASC']], // Sequelize's option is 'order'; 'orderBy' is silently ignored
});
return records.map((record) => ({
id: record.id,
label: record.firstName,
}));
}
static async createFromAuth(data, options) {
const transaction = (options && options.transaction) || undefined;
const users = await db.users.create(
{
email: data.email,
firstName: data.firstName,
authenticationUid: data.authenticationUid,
password: data.password,
},
{ transaction },
);
const app_role = await db.roles.findOne({
where: { name: config.roles?.user || "User" },
});
if (app_role?.id) {
await users.setApp_role(app_role?.id || null, {
transaction,
});
}
await users.update(
{
authenticationUid: users.id,
},
{ transaction },
);
delete users.password; // note: this removes the instance property; the underlying dataValues may still carry the hash
return users;
}
static async updatePassword(id, password, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const users = await db.users.findByPk(id, {
transaction,
});
await users.update(
{
password,
authenticationUid: id,
updatedById: currentUser.id,
},
{ transaction },
);
return users;
}
static async generateEmailVerificationToken(email, options) {
return this._generateToken(['emailVerificationToken', 'emailVerificationTokenExpiresAt'], email, options);
}
static async generatePasswordResetToken(email, options) {
return this._generateToken(['passwordResetToken', 'passwordResetTokenExpiresAt'], email, options);
}
static async findByPasswordResetToken(token, options) {
const transaction = (options && options.transaction) || undefined;
// findOne takes a single options object; include the transaction with the where clause
return db.users.findOne({
where: {
passwordResetToken: token,
passwordResetTokenExpiresAt: {
[db.Sequelize.Op.gt]: Date.now(),
},
},
transaction,
});
}
static async findByEmailVerificationToken(token, options) {
const transaction = (options && options.transaction) || undefined;
// findOne takes a single options object; include the transaction with the where clause
return db.users.findOne({
where: {
emailVerificationToken: token,
emailVerificationTokenExpiresAt: {
[db.Sequelize.Op.gt]: Date.now(),
},
},
transaction,
});
}
static async markEmailVerified(id, options) {
const currentUser = (options && options.currentUser) || { id: null };
const transaction = (options && options.transaction) || undefined;
const users = await db.users.findByPk(id, {
transaction,
});
await users.update(
{
emailVerified: true,
updatedById: currentUser.id,
},
{ transaction },
);
return true;
}
static async _generateToken(keyNames, email, options) {
const currentUser = (options && options.currentUser) || {id: null};
const transaction = (options && options.transaction) || undefined;
// findOne takes a single options object; include the transaction with the where clause
const users = await db.users.findOne({
where: { email: email.toLowerCase() },
transaction,
});
const token = crypto
.randomBytes(20)
.toString('hex');
const tokenExpiresAt = Date.now() + 360000; // 360000 ms = 6 minutes
if (users) {
await users.update(
{
[keyNames[0]]: token,
[keyNames[1]]: tokenExpiresAt,
updatedById: currentUser.id,
},
{transaction},
);
}
return token;
}
};


@ -0,0 +1,33 @@
module.exports = {
production: {
dialect: 'postgres',
username: process.env.DB_USER,
password: process.env.DB_PASS,
database: process.env.DB_NAME,
host: process.env.DB_HOST,
port: process.env.DB_PORT,
logging: console.log,
seederStorage: 'sequelize',
},
development: {
username: 'postgres',
dialect: 'postgres',
password: '',
database: 'db_limmu_gennet_school_website',
host: process.env.DB_HOST || 'localhost',
logging: console.log,
seederStorage: 'sequelize',
},
dev_stage: {
dialect: 'postgres',
username: process.env.DB_USER,
password: process.env.DB_PASS,
database: process.env.DB_NAME,
host: process.env.DB_HOST,
port: process.env.DB_PORT,
logging: console.log,
seederStorage: 'sequelize',
}
};
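For context, this per-environment object is the standard shape sequelize-cli-style loaders consume: the key matching NODE_ENV selects the connection settings. A minimal sketch of that selection (the config values here are illustrative placeholders, not the project's real credentials):

```javascript
// Illustrative copy of the config shape above; values are placeholders.
const dbConfig = {
  development: { dialect: 'postgres', host: 'localhost', logging: console.log },
  production: { dialect: 'postgres', host: process.env.DB_HOST, logging: console.log },
};

// Environment selection as typically done by sequelize-cli / models/index.js:
const env = process.env.NODE_ENV || 'development';
const active = dbConfig[env] || dbConfig.development;
```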

File diff suppressed because it is too large


@ -0,0 +1,232 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const admission_applications = sequelize.define(
'admission_applications',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
entry_type: {
type: DataTypes.ENUM,
values: [
"grade_9_entry",
"transfer"
],
},
applicant_full_name: {
type: DataTypes.TEXT,
},
gender: {
type: DataTypes.ENUM,
values: [
"female",
"male"
],
},
date_of_birth: {
type: DataTypes.DATE,
},
previous_school: {
type: DataTypes.TEXT,
},
guardian_full_name: {
type: DataTypes.TEXT,
},
guardian_phone: {
type: DataTypes.TEXT,
},
guardian_email: {
type: DataTypes.TEXT,
},
address_text: {
type: DataTypes.TEXT,
},
submitted_at: {
type: DataTypes.DATE,
},
status: {
type: DataTypes.ENUM,
values: [
"submitted",
"under_review",
"accepted",
"rejected",
"waitlisted"
],
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
admission_applications.associate = (db) => {
/// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity
//end loop
db.admission_applications.belongsTo(db.grades, {
as: 'requested_grade',
foreignKey: {
name: 'requested_gradeId',
},
constraints: false,
});
db.admission_applications.belongsTo(db.streams, {
as: 'requested_stream',
foreignKey: {
name: 'requested_streamId',
},
constraints: false,
});
db.admission_applications.hasMany(db.file, {
as: 'documents',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.admission_applications.getTableName(),
belongsToColumn: 'documents',
},
});
db.admission_applications.belongsTo(db.users, {
as: 'createdBy',
});
db.admission_applications.belongsTo(db.users, {
as: 'updatedBy',
});
};
return admission_applications;
};
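Each model module above exports a factory function plus an `associate` hook. The usual Sequelize `models/index.js` wires these in two passes: define every model first, then run the hooks so each relation target already exists on `db`. This loader is a hypothetical sketch of that pattern, not the project's actual file:

```javascript
// Hypothetical two-pass loader: define all models, then run associate hooks.
function loadModels(sequelize, DataTypes, factories) {
  const db = {};
  for (const factory of factories) {
    const model = factory(sequelize, DataTypes);
    db[model.name] = model; // first pass: register every model by name
  }
  for (const name of Object.keys(db)) {
    if (typeof db[name].associate === 'function') {
      db[name].associate(db); // second pass: wire relations across models
    }
  }
  return db;
}
```

With real Sequelize, each factory is one of the `module.exports = function(sequelize, DataTypes) {...}` modules shown here.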


@ -0,0 +1,172 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const alumni = sequelize.define(
'alumni',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
full_name: {
type: DataTypes.TEXT,
},
graduation_year: {
type: DataTypes.TEXT,
},
current_role: {
type: DataTypes.TEXT,
},
organization: {
type: DataTypes.TEXT,
},
bio: {
type: DataTypes.TEXT,
},
phone_number: {
type: DataTypes.TEXT,
},
email: {
type: DataTypes.TEXT,
},
notable: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
published: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
alumni.associate = (db) => {
/// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity
//end loop
db.alumni.hasMany(db.file, {
as: 'photo',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.alumni.getTableName(),
belongsToColumn: 'photo',
},
});
db.alumni.belongsTo(db.users, {
as: 'createdBy',
});
db.alumni.belongsTo(db.users, {
as: 'updatedBy',
});
};
return alumni;
};


@ -0,0 +1,198 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const announcements = sequelize.define(
'announcements',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
title_om: {
type: DataTypes.TEXT,
},
title_am: {
type: DataTypes.TEXT,
},
title_en: {
type: DataTypes.TEXT,
},
content_om: {
type: DataTypes.TEXT,
},
content_am: {
type: DataTypes.TEXT,
},
content_en: {
type: DataTypes.TEXT,
},
visibility: {
type: DataTypes.ENUM,
values: [
"public",
"staff_only",
"students_only"
],
},
publish_from: {
type: DataTypes.DATE,
},
publish_until: {
type: DataTypes.DATE,
},
pinned: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
announcements.associate = (db) => {
/// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity
//end loop
db.announcements.hasMany(db.file, {
as: 'images',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.announcements.getTableName(),
belongsToColumn: 'images',
},
});
db.announcements.hasMany(db.file, {
as: 'attachments',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.announcements.getTableName(),
belongsToColumn: 'attachments',
},
});
db.announcements.belongsTo(db.users, {
as: 'createdBy',
});
db.announcements.belongsTo(db.users, {
as: 'updatedBy',
});
};
return announcements;
};


@ -0,0 +1,166 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const assignment_submissions = sequelize.define(
'assignment_submissions',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
comment: {
type: DataTypes.TEXT,
},
submitted_at: {
type: DataTypes.DATE,
},
status: {
type: DataTypes.ENUM,
values: [
"submitted",
"reviewed",
"returned"
],
},
grade_value: {
type: DataTypes.DECIMAL,
},
teacher_feedback: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
assignment_submissions.associate = (db) => {
/// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity
//end loop
db.assignment_submissions.belongsTo(db.assignments, {
as: 'assignment',
foreignKey: {
name: 'assignmentId',
},
constraints: false,
});
db.assignment_submissions.belongsTo(db.students, {
as: 'student',
foreignKey: {
name: 'studentId',
},
constraints: false,
});
db.assignment_submissions.hasMany(db.file, {
as: 'files',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.assignment_submissions.getTableName(),
belongsToColumn: 'files',
},
});
db.assignment_submissions.belongsTo(db.users, {
as: 'createdBy',
});
db.assignment_submissions.belongsTo(db.users, {
as: 'updatedBy',
});
};
return assignment_submissions;
};


@ -0,0 +1,179 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const assignments = sequelize.define(
'assignments',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
title: {
type: DataTypes.TEXT,
},
instructions: {
type: DataTypes.TEXT,
},
assigned_at: {
type: DataTypes.DATE,
},
due_at: {
type: DataTypes.DATE,
},
visibility: {
type: DataTypes.ENUM,
values: [
"students_only",
"public"
],
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
assignments.associate = (db) => {
/// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity
db.assignments.hasMany(db.assignment_submissions, {
as: 'assignment_submissions_assignment',
foreignKey: {
name: 'assignmentId',
},
constraints: false,
});
//end loop
db.assignments.belongsTo(db.class_sections, {
as: 'class_section',
foreignKey: {
name: 'class_sectionId',
},
constraints: false,
});
db.assignments.belongsTo(db.subjects, {
as: 'subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.assignments.belongsTo(db.staff_members, {
as: 'teacher',
foreignKey: {
name: 'teacherId',
},
constraints: false,
});
db.assignments.hasMany(db.file, {
as: 'attachments',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.assignments.getTableName(),
belongsToColumn: 'attachments',
},
});
db.assignments.belongsTo(db.users, {
as: 'createdBy',
});
db.assignments.belongsTo(db.users, {
as: 'updatedBy',
});
};
return assignments;
};


@ -0,0 +1,138 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const attendance_records = sequelize.define(
'attendance_records',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
status: {
type: DataTypes.ENUM,
values: [
"present",
"absent",
"late",
"excused"
],
},
remark: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
attendance_records.associate = (db) => {
/// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity
//end loop
db.attendance_records.belongsTo(db.attendance_sessions, {
as: 'attendance_session',
foreignKey: {
name: 'attendance_sessionId',
},
constraints: false,
});
db.attendance_records.belongsTo(db.students, {
as: 'student',
foreignKey: {
name: 'studentId',
},
constraints: false,
});
db.attendance_records.belongsTo(db.users, {
as: 'createdBy',
});
db.attendance_records.belongsTo(db.users, {
as: 'updatedBy',
});
};
return attendance_records;
};


@ -0,0 +1,158 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const attendance_sessions = sequelize.define(
'attendance_sessions',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
session_at: {
type: DataTypes.DATE,
},
session_type: {
type: DataTypes.ENUM,
values: [
"period",
"exam",
"homeroom"
],
},
note: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
attendance_sessions.associate = (db) => {
/// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity
db.attendance_sessions.hasMany(db.attendance_records, {
as: 'attendance_records_attendance_session',
foreignKey: {
name: 'attendance_sessionId',
},
constraints: false,
});
//end loop
db.attendance_sessions.belongsTo(db.class_sections, {
as: 'class_section',
foreignKey: {
name: 'class_sectionId',
},
constraints: false,
});
db.attendance_sessions.belongsTo(db.subjects, {
as: 'subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.attendance_sessions.belongsTo(db.staff_members, {
as: 'teacher',
foreignKey: {
name: 'teacherId',
},
constraints: false,
});
db.attendance_sessions.belongsTo(db.users, {
as: 'createdBy',
});
db.attendance_sessions.belongsTo(db.users, {
as: 'updatedBy',
});
};
return attendance_sessions;
};


@ -0,0 +1,173 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const class_schedules = sequelize.define(
'class_schedules',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
weekday: {
type: DataTypes.ENUM,
values: [
"monday",
"tuesday",
"wednesday",
"thursday",
"friday",
"saturday"
],
},
period_label: {
type: DataTypes.TEXT,
},
starts_at: {
type: DataTypes.DATE,
},
ends_at: {
type: DataTypes.DATE,
},
room: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
class_schedules.associate = (db) => {
/// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity
//end loop
db.class_schedules.belongsTo(db.class_sections, {
as: 'class_section',
foreignKey: {
name: 'class_sectionId',
},
constraints: false,
});
db.class_schedules.belongsTo(db.subjects, {
as: 'subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.class_schedules.belongsTo(db.staff_members, {
as: 'teacher',
foreignKey: {
name: 'teacherId',
},
constraints: false,
});
db.class_schedules.belongsTo(db.users, {
as: 'createdBy',
});
db.class_schedules.belongsTo(db.users, {
as: 'updatedBy',
});
};
return class_schedules;
};


@ -0,0 +1,179 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const class_sections = sequelize.define(
'class_sections',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
capacity: {
type: DataTypes.INTEGER,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
class_sections.associate = (db) => {
/// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity
db.class_sections.hasMany(db.enrollments, {
as: 'enrollments_class_section',
foreignKey: {
name: 'class_sectionId',
},
constraints: false,
});
db.class_sections.hasMany(db.attendance_sessions, {
as: 'attendance_sessions_class_section',
foreignKey: {
name: 'class_sectionId',
},
constraints: false,
});
db.class_sections.hasMany(db.class_schedules, {
as: 'class_schedules_class_section',
foreignKey: {
name: 'class_sectionId',
},
constraints: false,
});
db.class_sections.hasMany(db.assignments, {
as: 'assignments_class_section',
foreignKey: {
name: 'class_sectionId',
},
constraints: false,
});
db.class_sections.hasMany(db.lesson_plans, {
as: 'lesson_plans_class_section',
foreignKey: {
name: 'class_sectionId',
},
constraints: false,
});
//end loop
db.class_sections.belongsTo(db.school_years, {
as: 'school_year',
foreignKey: {
name: 'school_yearId',
},
constraints: false,
});
db.class_sections.belongsTo(db.grades, {
as: 'grade',
foreignKey: {
name: 'gradeId',
},
constraints: false,
});
db.class_sections.belongsTo(db.streams, {
as: 'stream',
foreignKey: {
name: 'streamId',
},
constraints: false,
});
db.class_sections.belongsTo(db.staff_members, {
as: 'homeroom_teacher',
foreignKey: {
name: 'homeroom_teacherId',
},
constraints: false,
});
db.class_sections.belongsTo(db.users, {
as: 'createdBy',
});
db.class_sections.belongsTo(db.users, {
as: 'updatedBy',
});
};
return class_sections;
};


@ -0,0 +1,172 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const contact_messages = sequelize.define(
'contact_messages',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
full_name: {
type: DataTypes.TEXT,
},
phone_number: {
type: DataTypes.TEXT,
},
email: {
type: DataTypes.TEXT,
},
topic: {
type: DataTypes.ENUM,
values: [
"general",
"admissions",
"academics",
"results",
"technical"
],
},
message: {
type: DataTypes.TEXT,
},
submitted_at: {
type: DataTypes.DATE,
},
status: {
type: DataTypes.ENUM,
values: [
"new",
"in_progress",
"resolved"
],
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
contact_messages.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.contact_messages.belongsTo(db.users, {
as: 'createdBy',
});
db.contact_messages.belongsTo(db.users, {
as: 'updatedBy',
});
};
return contact_messages;
};


@ -0,0 +1,138 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const enrollments = sequelize.define(
'enrollments',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
enrolled_at: {
type: DataTypes.DATE,
},
status: {
type: DataTypes.ENUM,
values: [
"active",
"transferred",
"graduated",
"dropped"
],
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
enrollments.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.enrollments.belongsTo(db.students, {
as: 'student',
foreignKey: {
name: 'studentId',
},
constraints: false,
});
db.enrollments.belongsTo(db.class_sections, {
as: 'class_section',
foreignKey: {
name: 'class_sectionId',
},
constraints: false,
});
db.enrollments.belongsTo(db.users, {
as: 'createdBy',
});
db.enrollments.belongsTo(db.users, {
as: 'updatedBy',
});
};
return enrollments;
};


@ -0,0 +1,195 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const events = sequelize.define(
'events',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
title_om: {
type: DataTypes.TEXT,
},
title_am: {
type: DataTypes.TEXT,
},
title_en: {
type: DataTypes.TEXT,
},
description_om: {
type: DataTypes.TEXT,
},
description_am: {
type: DataTypes.TEXT,
},
description_en: {
type: DataTypes.TEXT,
},
starts_at: {
type: DataTypes.DATE,
},
ends_at: {
type: DataTypes.DATE,
},
location_text: {
type: DataTypes.TEXT,
},
visibility: {
type: DataTypes.ENUM,
values: [
"public",
"staff_only",
"students_only"
],
},
published: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
events.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.events.hasMany(db.file, {
as: 'cover_image',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.events.getTableName(),
belongsToColumn: 'cover_image',
},
});
db.events.belongsTo(db.users, {
as: 'createdBy',
});
db.events.belongsTo(db.users, {
as: 'updatedBy',
});
};
return events;
};


@ -0,0 +1,146 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const exam_performance_summaries = sequelize.define(
'exam_performance_summaries',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
average_score: {
type: DataTypes.DECIMAL,
},
highest_score: {
type: DataTypes.DECIMAL,
},
lowest_score: {
type: DataTypes.DECIMAL,
},
pass_rate: {
type: DataTypes.DECIMAL,
},
student_count: {
type: DataTypes.INTEGER,
},
published: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
exam_performance_summaries.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.exam_performance_summaries.belongsTo(db.exams, {
as: 'exam',
foreignKey: {
name: 'examId',
},
constraints: false,
});
db.exam_performance_summaries.belongsTo(db.users, {
as: 'createdBy',
});
db.exam_performance_summaries.belongsTo(db.users, {
as: 'updatedBy',
});
};
return exam_performance_summaries;
};
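The summary fields above (average_score, highest_score, lowest_score, pass_rate, student_count) imply a simple aggregation over exam_results rows. A minimal plain-JavaScript sketch of that computation — the `summarizeExam` helper and the 50% pass mark are illustrative assumptions, not part of this codebase:

```javascript
// Hypothetical helper: derives an exam_performance_summaries-shaped
// object from exam_results-like records. The 50% pass mark is an
// assumption; this codebase does not define one.
function summarizeExam(results, passMark = 50) {
  // Multiply before dividing so integer inputs stay exact.
  const scores = results.map((r) => (r.score * 100) / r.out_of);
  const passed = scores.filter((s) => s >= passMark).length;
  return {
    average_score: scores.reduce((a, b) => a + b, 0) / scores.length,
    highest_score: Math.max(...scores),
    lowest_score: Math.min(...scores),
    pass_rate: passed / scores.length,
    student_count: scores.length,
  };
}

const summary = summarizeExam([
  { score: 45, out_of: 50 }, // 90%
  { score: 20, out_of: 50 }, // 40%
]);
console.log(summary.student_count); // 2
```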


@ -0,0 +1,168 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const exam_results = sequelize.define(
'exam_results',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
score: {
type: DataTypes.DECIMAL,
},
out_of: {
type: DataTypes.DECIMAL,
},
result_status: {
type: DataTypes.ENUM,
values: [
"draft",
"submitted",
"approved",
"published"
],
},
entered_at: {
type: DataTypes.DATE,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
exam_results.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.exam_results.belongsTo(db.exams, {
as: 'exam',
foreignKey: {
name: 'examId',
},
constraints: false,
});
db.exam_results.belongsTo(db.students, {
as: 'student',
foreignKey: {
name: 'studentId',
},
constraints: false,
});
db.exam_results.belongsTo(db.subjects, {
as: 'subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.exam_results.belongsTo(db.staff_members, {
as: 'entered_by',
foreignKey: {
name: 'entered_byId',
},
constraints: false,
});
db.exam_results.belongsTo(db.users, {
as: 'createdBy',
});
db.exam_results.belongsTo(db.users, {
as: 'updatedBy',
});
};
return exam_results;
};


@ -0,0 +1,208 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const exams = sequelize.define(
'exams',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
exam_type: {
type: DataTypes.ENUM,
values: [
"monthly",
"midterm",
"final",
"national_grade_8",
"national_grade_10",
"national_grade_12"
],
},
starts_at: {
type: DataTypes.DATE,
},
ends_at: {
type: DataTypes.DATE,
},
public_results: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
exams.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.exams.hasMany(db.exam_results, {
as: 'exam_results_exam',
foreignKey: {
name: 'examId',
},
constraints: false,
});
db.exams.hasMany(db.exam_performance_summaries, {
as: 'exam_performance_summaries_exam',
foreignKey: {
name: 'examId',
},
constraints: false,
});
db.exams.hasMany(db.top_student_features, {
as: 'top_student_features_exam',
foreignKey: {
name: 'examId',
},
constraints: false,
});
//end loop
db.exams.belongsTo(db.school_years, {
as: 'school_year',
foreignKey: {
name: 'school_yearId',
},
constraints: false,
});
db.exams.belongsTo(db.terms, {
as: 'term',
foreignKey: {
name: 'termId',
},
constraints: false,
});
db.exams.belongsTo(db.grades, {
as: 'grade',
foreignKey: {
name: 'gradeId',
},
constraints: false,
});
db.exams.belongsTo(db.streams, {
as: 'stream',
foreignKey: {
name: 'streamId',
},
constraints: false,
});
db.exams.belongsTo(db.users, {
as: 'createdBy',
});
db.exams.belongsTo(db.users, {
as: 'updatedBy',
});
};
return exams;
};


@ -0,0 +1,53 @@
module.exports = function(sequelize, DataTypes) {
const file = sequelize.define(
'file',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
belongsTo: DataTypes.STRING(255),
belongsToId: DataTypes.UUID,
belongsToColumn: DataTypes.STRING(255),
name: {
type: DataTypes.STRING(2083),
allowNull: false,
validate: {
notEmpty: true,
},
},
sizeInBytes: {
type: DataTypes.INTEGER,
allowNull: true,
},
privateUrl: {
type: DataTypes.STRING(2083),
allowNull: true,
},
publicUrl: {
type: DataTypes.STRING(2083),
allowNull: false,
validate: {
notEmpty: true,
},
},
},
{
timestamps: true,
paranoid: true,
},
);
file.associate = (db) => {
db.file.belongsTo(db.users, {
as: 'createdBy',
});
db.file.belongsTo(db.users, {
as: 'updatedBy',
});
};
return file;
};
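The `file` model is the polymorphic attachment store: each row records its owner's table (`belongsTo`), row id (`belongsToId`), and logical field (`belongsToColumn`), and the scoped `hasMany(db.file, ...)` associations elsewhere (e.g. `galleries.images`, `events.cover_image`) filter on those three columns. Sequelize does this in SQL; the following plain-JavaScript sketch only illustrates the filter the scope expresses:

```javascript
// Illustrative only: mimics how a scoped hasMany on the polymorphic
// `file` table resolves attachments. The scope pins belongsTo to the
// owner's table name and belongsToColumn to the logical field, while
// the foreign key matches belongsToId against the owner's id.
function attachmentsFor(files, ownerTable, ownerId, column) {
  return files.filter(
    (f) =>
      f.belongsTo === ownerTable &&
      f.belongsToId === ownerId &&
      f.belongsToColumn === column,
  );
}

const files = [
  { belongsTo: 'galleries', belongsToId: 'g1', belongsToColumn: 'images', name: 'a.jpg' },
  { belongsTo: 'galleries', belongsToId: 'g1', belongsToColumn: 'images', name: 'b.jpg' },
  { belongsTo: 'events', belongsToId: 'e1', belongsToColumn: 'cover_image', name: 'c.jpg' },
];

const images = attachmentsFor(files, 'galleries', 'g1', 'images');
console.log(images.length); // 2
```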


@ -0,0 +1,162 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const galleries = sequelize.define(
'galleries',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
title_om: {
type: DataTypes.TEXT,
},
title_am: {
type: DataTypes.TEXT,
},
title_en: {
type: DataTypes.TEXT,
},
description_om: {
type: DataTypes.TEXT,
},
description_am: {
type: DataTypes.TEXT,
},
description_en: {
type: DataTypes.TEXT,
},
published: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
published_at: {
type: DataTypes.DATE,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
galleries.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.galleries.hasMany(db.file, {
as: 'images',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.galleries.getTableName(),
belongsToColumn: 'images',
},
});
db.galleries.belongsTo(db.users, {
as: 'createdBy',
});
db.galleries.belongsTo(db.users, {
as: 'updatedBy',
});
};
return galleries;
};


@ -0,0 +1,171 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const grades = sequelize.define(
'grades',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
grade_number: {
type: DataTypes.INTEGER,
},
level: {
type: DataTypes.ENUM,
values: [
"high_school",
"preparatory"
],
},
label: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
grades.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.grades.hasMany(db.subject_offerings, {
as: 'subject_offerings_grade',
foreignKey: {
name: 'gradeId',
},
constraints: false,
});
db.grades.hasMany(db.students, {
as: 'students_current_grade',
foreignKey: {
name: 'current_gradeId',
},
constraints: false,
});
db.grades.hasMany(db.class_sections, {
as: 'class_sections_grade',
foreignKey: {
name: 'gradeId',
},
constraints: false,
});
db.grades.hasMany(db.study_materials, {
as: 'study_materials_grade',
foreignKey: {
name: 'gradeId',
},
constraints: false,
});
db.grades.hasMany(db.exams, {
as: 'exams_grade',
foreignKey: {
name: 'gradeId',
},
constraints: false,
});
db.grades.hasMany(db.admission_applications, {
as: 'admission_applications_requested_grade',
foreignKey: {
name: 'requested_gradeId',
},
constraints: false,
});
//end loop
db.grades.belongsTo(db.users, {
as: 'createdBy',
});
db.grades.belongsTo(db.users, {
as: 'updatedBy',
});
};
return grades;
};


@ -0,0 +1,38 @@
'use strict';
const fs = require('fs');
const path = require('path');
const Sequelize = require('sequelize');
const basename = path.basename(__filename);
const env = process.env.NODE_ENV || 'development';
const config = require("../db.config")[env];
const db = {};
let sequelize;
console.log('Sequelize env:', env);
if (config.use_env_variable) {
sequelize = new Sequelize(process.env[config.use_env_variable], config);
} else {
sequelize = new Sequelize(config.database, config.username, config.password, config);
}
fs
.readdirSync(__dirname)
.filter(file => {
return (file.indexOf('.') !== 0) && (file !== basename) && (file.slice(-3) === '.js');
})
.forEach(file => {
const model = require(path.join(__dirname, file))(sequelize, Sequelize.DataTypes);
db[model.name] = model;
});
Object.keys(db).forEach(modelName => {
if (db[modelName].associate) {
db[modelName].associate(db);
}
});
db.sequelize = sequelize;
db.Sequelize = Sequelize;
module.exports = db;
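The loader above is a two-phase pattern: phase 1 requires and registers every model on `db`, and only then does phase 2 call each model's `associate(db)`, so any model can reference any sibling regardless of file load order. A self-contained sketch of that pattern — the stub "models" are plain objects, not real Sequelize models:

```javascript
// Two-phase registration, as in the loader: register everything
// first, then wire associations once every sibling exists on `db`.
const db = {};

const definitions = [
  { name: 'grades', associate: (d) => { d.grades.children = [d.class_sections]; } },
  { name: 'class_sections', associate: (d) => { d.class_sections.parent = d.grades; } },
];

// Phase 1: define/register all models.
for (const def of definitions) {
  db[def.name] = { name: def.name, associate: def.associate };
}

// Phase 2: run associate() callbacks; grades can safely reference
// class_sections here even though it was defined later.
for (const name of Object.keys(db)) {
  if (db[name].associate) db[name].associate(db);
}

console.log(db.class_sections.parent.name); // grades
```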


@ -0,0 +1,167 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const lesson_plans = sequelize.define(
'lesson_plans',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
week_start_at: {
type: DataTypes.DATE,
},
week_end_at: {
type: DataTypes.DATE,
},
plan_content: {
type: DataTypes.TEXT,
},
status: {
type: DataTypes.ENUM,
values: [
"draft",
"submitted",
"approved"
],
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
lesson_plans.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.lesson_plans.belongsTo(db.staff_members, {
as: 'teacher',
foreignKey: {
name: 'teacherId',
},
constraints: false,
});
db.lesson_plans.belongsTo(db.class_sections, {
as: 'class_section',
foreignKey: {
name: 'class_sectionId',
},
constraints: false,
});
db.lesson_plans.belongsTo(db.subjects, {
as: 'subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.lesson_plans.hasMany(db.file, {
as: 'attachments',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.lesson_plans.getTableName(),
belongsToColumn: 'attachments',
},
});
db.lesson_plans.belongsTo(db.users, {
as: 'createdBy',
});
db.lesson_plans.belongsTo(db.users, {
as: 'updatedBy',
});
};
return lesson_plans;
};


@ -0,0 +1,156 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const messages = sequelize.define(
'messages',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
subject: {
type: DataTypes.TEXT,
},
body: {
type: DataTypes.TEXT,
},
status: {
type: DataTypes.ENUM,
values: [
"sent",
"read",
"archived"
],
},
sent_at: {
type: DataTypes.DATE,
},
read_at: {
type: DataTypes.DATE,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
messages.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.messages.belongsTo(db.users, {
as: 'sender',
foreignKey: {
name: 'senderId',
},
constraints: false,
});
db.messages.belongsTo(db.users, {
as: 'recipient',
foreignKey: {
name: 'recipientId',
},
constraints: false,
});
db.messages.belongsTo(db.users, {
as: 'createdBy',
});
db.messages.belongsTo(db.users, {
as: 'updatedBy',
});
};
return messages;
};


@ -0,0 +1,200 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const news_posts = sequelize.define(
'news_posts',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
title_om: {
type: DataTypes.TEXT,
},
title_am: {
type: DataTypes.TEXT,
},
title_en: {
type: DataTypes.TEXT,
},
excerpt_om: {
type: DataTypes.TEXT,
},
excerpt_am: {
type: DataTypes.TEXT,
},
excerpt_en: {
type: DataTypes.TEXT,
},
content_om: {
type: DataTypes.TEXT,
},
content_am: {
type: DataTypes.TEXT,
},
content_en: {
type: DataTypes.TEXT,
},
status: {
type: DataTypes.ENUM,
values: [
"draft",
"published",
"archived"
],
},
published_at: {
type: DataTypes.DATE,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
news_posts.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.news_posts.belongsTo(db.staff_members, {
as: 'author',
foreignKey: {
name: 'authorId',
},
constraints: false,
});
db.news_posts.hasMany(db.file, {
as: 'featured_image',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.news_posts.getTableName(),
belongsToColumn: 'featured_image',
},
});
db.news_posts.belongsTo(db.users, {
as: 'createdBy',
});
db.news_posts.belongsTo(db.users, {
as: 'updatedBy',
});
};
return news_posts;
};


@ -0,0 +1,202 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const pages = sequelize.define(
'pages',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
page_type: {
type: DataTypes.ENUM,
values: [
"about_history",
"about_mission_vision_values",
"about_principal_message",
"about_facilities",
"about_achievements",
"parents_information",
"student_guide",
"admissions",
"contact",
"alumni"
],
},
title_om: {
type: DataTypes.TEXT,
},
title_am: {
type: DataTypes.TEXT,
},
title_en: {
type: DataTypes.TEXT,
},
content_om: {
type: DataTypes.TEXT,
},
content_am: {
type: DataTypes.TEXT,
},
content_en: {
type: DataTypes.TEXT,
},
published: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
published_at: {
type: DataTypes.DATE,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
pages.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.pages.hasMany(db.file, {
as: 'cover_image',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.pages.getTableName(),
belongsToColumn: 'cover_image',
},
});
db.pages.belongsTo(db.users, {
as: 'createdBy',
});
db.pages.belongsTo(db.users, {
as: 'updatedBy',
});
};
return pages;
};


@ -0,0 +1,100 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const permissions = sequelize.define(
'permissions',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
permissions.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.permissions.belongsTo(db.users, {
as: 'createdBy',
});
db.permissions.belongsTo(db.users, {
as: 'updatedBy',
});
};
return permissions;
};


@ -0,0 +1,133 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const roles = sequelize.define(
'roles',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
role_customization: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
roles.associate = (db) => {
db.roles.belongsToMany(db.permissions, {
as: 'permissions',
foreignKey: {
name: 'roles_permissionsId',
},
constraints: false,
through: 'rolesPermissionsPermissions',
});
db.roles.belongsToMany(db.permissions, {
as: 'permissions_filter',
foreignKey: {
name: 'roles_permissionsId',
},
constraints: false,
through: 'rolesPermissionsPermissions',
});
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.roles.hasMany(db.users, {
as: 'users_app_role',
foreignKey: {
name: 'app_roleId',
},
constraints: false,
});
//end loop
db.roles.belongsTo(db.users, {
as: 'createdBy',
});
db.roles.belongsTo(db.users, {
as: 'updatedBy',
});
};
return roles;
};
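The paired `belongsToMany` calls above route roles and permissions through the junction table `rolesPermissionsPermissions`: each junction row pairs a role id with a permission id. A plain-JavaScript sketch of the join that association represents — the field names `roleId`/`permissionId` and the sample permission names are illustrative, not taken from this schema:

```javascript
// Illustrative resolution of a many-to-many through a junction table,
// as the roles <-> permissions belongsToMany associations do in SQL.
function permissionsForRole(roleId, junctionRows, permissions) {
  const ids = junctionRows
    .filter((row) => row.roleId === roleId)
    .map((row) => row.permissionId);
  return permissions.filter((p) => ids.includes(p.id));
}

const junction = [
  { roleId: 'r1', permissionId: 'p1' },
  { roleId: 'r1', permissionId: 'p2' },
  { roleId: 'r2', permissionId: 'p2' },
];
const perms = [
  { id: 'p1', name: 'READ_USERS' },
  { id: 'p2', name: 'EDIT_USERS' },
];

const r1Perms = permissionsForRole('r1', junction, perms);
console.log(r1Perms.map((p) => p.name)); // [ 'READ_USERS', 'EDIT_USERS' ]
```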


@ -0,0 +1,190 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const school_settings = sequelize.define(
'school_settings',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
school_name_om: {
type: DataTypes.TEXT,
},
school_name_am: {
type: DataTypes.TEXT,
},
school_name_en: {
type: DataTypes.TEXT,
},
welcome_message_om: {
type: DataTypes.TEXT,
},
welcome_message_am: {
type: DataTypes.TEXT,
},
welcome_message_en: {
type: DataTypes.TEXT,
},
address_text: {
type: DataTypes.TEXT,
},
map_embed_url: {
type: DataTypes.TEXT,
},
public_phone_numbers: {
type: DataTypes.TEXT,
},
public_emails: {
type: DataTypes.TEXT,
},
social_links: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
school_settings.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.school_settings.hasMany(db.file, {
as: 'logo',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.school_settings.getTableName(),
belongsToColumn: 'logo',
},
});
db.school_settings.hasMany(db.file, {
as: 'hero_images',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.school_settings.getTableName(),
belongsToColumn: 'hero_images',
},
});
db.school_settings.belongsTo(db.users, {
as: 'createdBy',
});
db.school_settings.belongsTo(db.users, {
as: 'updatedBy',
});
};
return school_settings;
};


@ -0,0 +1,129 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const school_statistics = sequelize.define(
'school_statistics',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
student_count: {
type: DataTypes.INTEGER,
},
teacher_count: {
type: DataTypes.INTEGER,
},
pass_rate: {
type: DataTypes.DECIMAL,
},
notes: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
school_statistics.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
//end loop
db.school_statistics.belongsTo(db.school_years, {
as: 'school_year',
foreignKey: {
name: 'school_yearId',
},
constraints: false,
});
db.school_statistics.belongsTo(db.users, {
as: 'createdBy',
});
db.school_statistics.belongsTo(db.users, {
as: 'updatedBy',
});
};
return school_statistics;
};


@ -0,0 +1,156 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const school_years = sequelize.define(
'school_years',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name: {
type: DataTypes.TEXT,
},
starts_at: {
type: DataTypes.DATE,
},
ends_at: {
type: DataTypes.DATE,
},
current: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
school_years.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.school_years.hasMany(db.class_sections, {
as: 'class_sections_school_year',
foreignKey: {
name: 'school_yearId',
},
constraints: false,
});
db.school_years.hasMany(db.terms, {
as: 'terms_school_year',
foreignKey: {
name: 'school_yearId',
},
constraints: false,
});
db.school_years.hasMany(db.exams, {
as: 'exams_school_year',
foreignKey: {
name: 'school_yearId',
},
constraints: false,
});
db.school_years.hasMany(db.school_statistics, {
as: 'school_statistics_school_year',
foreignKey: {
name: 'school_yearId',
},
constraints: false,
});
//end loop
db.school_years.belongsTo(db.users, {
as: 'createdBy',
});
db.school_years.belongsTo(db.users, {
as: 'updatedBy',
});
};
return school_years;
};


@ -0,0 +1,270 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const staff_members = sequelize.define(
'staff_members',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
full_name: {
type: DataTypes.TEXT,
},
staff_role: {
type: DataTypes.ENUM,
values: [
"principal",
"vice_principal",
"teacher",
"registrar",
"librarian",
"admin_staff",
"it_support"
],
},
bio: {
type: DataTypes.TEXT,
},
phone_number: {
type: DataTypes.TEXT,
},
email: {
type: DataTypes.TEXT,
},
public_profile: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
staff_members.associate = (db) => {
db.staff_members.belongsToMany(db.subjects, {
as: 'subjects_taught',
foreignKey: {
name: 'staff_members_subjects_taughtId',
},
constraints: false,
through: 'staff_membersSubjects_taughtSubjects',
});
db.staff_members.belongsToMany(db.subjects, {
as: 'subjects_taught_filter',
foreignKey: {
name: 'staff_members_subjects_taughtId',
},
constraints: false,
through: 'staff_membersSubjects_taughtSubjects',
});
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.staff_members.hasMany(db.subject_offerings, {
as: 'subject_offerings_responsible_teacher',
foreignKey: {
name: 'responsible_teacherId',
},
constraints: false,
});
db.staff_members.hasMany(db.class_sections, {
as: 'class_sections_homeroom_teacher',
foreignKey: {
name: 'homeroom_teacherId',
},
constraints: false,
});
db.staff_members.hasMany(db.news_posts, {
as: 'news_posts_author',
foreignKey: {
name: 'authorId',
},
constraints: false,
});
db.staff_members.hasMany(db.study_materials, {
as: 'study_materials_uploaded_by',
foreignKey: {
name: 'uploaded_byId',
},
constraints: false,
});
db.staff_members.hasMany(db.exam_results, {
as: 'exam_results_entered_by',
foreignKey: {
name: 'entered_byId',
},
constraints: false,
});
db.staff_members.hasMany(db.attendance_sessions, {
as: 'attendance_sessions_teacher',
foreignKey: {
name: 'teacherId',
},
constraints: false,
});
db.staff_members.hasMany(db.class_schedules, {
as: 'class_schedules_teacher',
foreignKey: {
name: 'teacherId',
},
constraints: false,
});
db.staff_members.hasMany(db.assignments, {
as: 'assignments_teacher',
foreignKey: {
name: 'teacherId',
},
constraints: false,
});
db.staff_members.hasMany(db.lesson_plans, {
as: 'lesson_plans_teacher',
foreignKey: {
name: 'teacherId',
},
constraints: false,
});
//end loop
db.staff_members.belongsTo(db.users, {
as: 'user',
foreignKey: {
name: 'userId',
},
constraints: false,
});
db.staff_members.hasMany(db.file, {
as: 'photo',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.staff_members.getTableName(),
belongsToColumn: 'photo',
},
});
db.staff_members.belongsTo(db.users, {
as: 'createdBy',
});
db.staff_members.belongsTo(db.users, {
as: 'updatedBy',
});
};
return staff_members;
};


@ -0,0 +1,181 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const streams = sequelize.define(
'streams',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name_om: {
type: DataTypes.TEXT,
},
name_am: {
type: DataTypes.TEXT,
},
name_en: {
type: DataTypes.TEXT,
},
stream_type: {
type: DataTypes.ENUM,
values: [
"natural",
"social",
"general"
],
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
streams.associate = (db) => {
// Generator loop: for each entity's fields, if a field's ref matches this entity's name, create a hasMany relation on the parent entity.
db.streams.hasMany(db.subject_offerings, {
as: 'subject_offerings_stream',
foreignKey: {
name: 'streamId',
},
constraints: false,
});
db.streams.hasMany(db.students, {
as: 'students_current_stream',
foreignKey: {
name: 'current_streamId',
},
constraints: false,
});
db.streams.hasMany(db.class_sections, {
as: 'class_sections_stream',
foreignKey: {
name: 'streamId',
},
constraints: false,
});
db.streams.hasMany(db.study_materials, {
as: 'study_materials_stream',
foreignKey: {
name: 'streamId',
},
constraints: false,
});
db.streams.hasMany(db.exams, {
as: 'exams_stream',
foreignKey: {
name: 'streamId',
},
constraints: false,
});
db.streams.hasMany(db.admission_applications, {
as: 'admission_applications_requested_stream',
foreignKey: {
name: 'requested_streamId',
},
constraints: false,
});
//end loop
db.streams.belongsTo(db.users, {
as: 'createdBy',
});
db.streams.belongsTo(db.users, {
as: 'updatedBy',
});
};
return streams;
};


@@ -0,0 +1,225 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const students = sequelize.define(
'students',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
full_name: {
type: DataTypes.TEXT,
},
student_code: {
type: DataTypes.TEXT,
},
gender: {
type: DataTypes.ENUM,
values: [
"female",
"male"
],
},
date_of_birth: {
type: DataTypes.DATE,
},
guardian_name: {
type: DataTypes.TEXT,
},
guardian_phone: {
type: DataTypes.TEXT,
},
address_text: {
type: DataTypes.TEXT,
},
active: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
students.associate = (db) => {
    /// Loop through entities and their fields; if a field's ref matches this entity, create a hasMany relation on the parent entity.
db.students.hasMany(db.enrollments, {
as: 'enrollments_student',
foreignKey: {
name: 'studentId',
},
constraints: false,
});
db.students.hasMany(db.exam_results, {
as: 'exam_results_student',
foreignKey: {
name: 'studentId',
},
constraints: false,
});
db.students.hasMany(db.attendance_records, {
as: 'attendance_records_student',
foreignKey: {
name: 'studentId',
},
constraints: false,
});
db.students.hasMany(db.assignment_submissions, {
as: 'assignment_submissions_student',
foreignKey: {
name: 'studentId',
},
constraints: false,
});
db.students.hasMany(db.top_student_features, {
as: 'top_student_features_student',
foreignKey: {
name: 'studentId',
},
constraints: false,
});
//end loop
db.students.belongsTo(db.users, {
as: 'user',
foreignKey: {
name: 'userId',
},
constraints: false,
});
db.students.belongsTo(db.grades, {
as: 'current_grade',
foreignKey: {
name: 'current_gradeId',
},
constraints: false,
});
db.students.belongsTo(db.streams, {
as: 'current_stream',
foreignKey: {
name: 'current_streamId',
},
constraints: false,
});
db.students.belongsTo(db.users, {
as: 'createdBy',
});
db.students.belongsTo(db.users, {
as: 'updatedBy',
});
};
return students;
};


@@ -0,0 +1,221 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const study_materials = sequelize.define(
'study_materials',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
title_om: {
type: DataTypes.TEXT,
},
title_am: {
type: DataTypes.TEXT,
},
title_en: {
type: DataTypes.TEXT,
},
material_type: {
type: DataTypes.ENUM,
values: [
"past_exam_paper",
"revision_notes",
"practice_questions",
"reference_list",
"video_link",
"other"
],
},
description: {
type: DataTypes.TEXT,
},
external_url: {
type: DataTypes.TEXT,
},
visibility: {
type: DataTypes.ENUM,
values: [
"public",
"students_only"
],
},
published_at: {
type: DataTypes.DATE,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
study_materials.associate = (db) => {
    /// Loop through entities and their fields; if a field's ref matches this entity, create a hasMany relation on the parent entity.
//end loop
db.study_materials.belongsTo(db.grades, {
as: 'grade',
foreignKey: {
name: 'gradeId',
},
constraints: false,
});
db.study_materials.belongsTo(db.subjects, {
as: 'subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.study_materials.belongsTo(db.streams, {
as: 'stream',
foreignKey: {
name: 'streamId',
},
constraints: false,
});
db.study_materials.belongsTo(db.staff_members, {
as: 'uploaded_by',
foreignKey: {
name: 'uploaded_byId',
},
constraints: false,
});
db.study_materials.hasMany(db.file, {
as: 'file',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.study_materials.getTableName(),
belongsToColumn: 'file',
},
});
db.study_materials.belongsTo(db.users, {
as: 'createdBy',
});
db.study_materials.belongsTo(db.users, {
as: 'updatedBy',
});
};
return study_materials;
};


@@ -0,0 +1,153 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const subject_offerings = sequelize.define(
'subject_offerings',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
learning_objectives_om: {
type: DataTypes.TEXT,
},
learning_objectives_am: {
type: DataTypes.TEXT,
},
learning_objectives_en: {
type: DataTypes.TEXT,
},
topics_outline: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
subject_offerings.associate = (db) => {
    /// Loop through entities and their fields; if a field's ref matches this entity, create a hasMany relation on the parent entity.
//end loop
db.subject_offerings.belongsTo(db.subjects, {
as: 'subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.subject_offerings.belongsTo(db.grades, {
as: 'grade',
foreignKey: {
name: 'gradeId',
},
constraints: false,
});
db.subject_offerings.belongsTo(db.streams, {
as: 'stream',
foreignKey: {
name: 'streamId',
},
constraints: false,
});
db.subject_offerings.belongsTo(db.staff_members, {
as: 'responsible_teacher',
foreignKey: {
name: 'responsible_teacherId',
},
constraints: false,
});
db.subject_offerings.belongsTo(db.users, {
as: 'createdBy',
});
db.subject_offerings.belongsTo(db.users, {
as: 'updatedBy',
});
};
return subject_offerings;
};


@@ -0,0 +1,208 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const subjects = sequelize.define(
'subjects',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
name_om: {
type: DataTypes.TEXT,
},
name_am: {
type: DataTypes.TEXT,
},
name_en: {
type: DataTypes.TEXT,
},
overview_om: {
type: DataTypes.TEXT,
},
overview_am: {
type: DataTypes.TEXT,
},
overview_en: {
type: DataTypes.TEXT,
},
textbook_references: {
type: DataTypes.TEXT,
},
active: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
subjects.associate = (db) => {
    /// Loop through entities and their fields; if a field's ref matches this entity, create a hasMany relation on the parent entity.
db.subjects.hasMany(db.subject_offerings, {
as: 'subject_offerings_subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.subjects.hasMany(db.study_materials, {
as: 'study_materials_subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.subjects.hasMany(db.exam_results, {
as: 'exam_results_subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.subjects.hasMany(db.attendance_sessions, {
as: 'attendance_sessions_subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.subjects.hasMany(db.class_schedules, {
as: 'class_schedules_subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.subjects.hasMany(db.assignments, {
as: 'assignments_subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
db.subjects.hasMany(db.lesson_plans, {
as: 'lesson_plans_subject',
foreignKey: {
name: 'subjectId',
},
constraints: false,
});
//end loop
db.subjects.belongsTo(db.users, {
as: 'createdBy',
});
db.subjects.belongsTo(db.users, {
as: 'updatedBy',
});
};
return subjects;
};


@@ -0,0 +1,142 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const terms = sequelize.define(
'terms',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
term_name: {
type: DataTypes.ENUM,
values: [
"term_1",
"term_2",
"term_3"
],
},
starts_at: {
type: DataTypes.DATE,
},
ends_at: {
type: DataTypes.DATE,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
terms.associate = (db) => {
    /// Loop through entities and their fields; if a field's ref matches this entity, create a hasMany relation on the parent entity.
db.terms.hasMany(db.exams, {
as: 'exams_term',
foreignKey: {
name: 'termId',
},
constraints: false,
});
//end loop
db.terms.belongsTo(db.school_years, {
as: 'school_year',
foreignKey: {
name: 'school_yearId',
},
constraints: false,
});
db.terms.belongsTo(db.users, {
as: 'createdBy',
});
db.terms.belongsTo(db.users, {
as: 'updatedBy',
});
};
return terms;
};


@@ -0,0 +1,150 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const top_student_features = sequelize.define(
'top_student_features',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
total_score: {
type: DataTypes.DECIMAL,
},
rank: {
type: DataTypes.INTEGER,
},
consent_to_publish: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
published: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
note: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
top_student_features.associate = (db) => {
    /// Loop through entities and their fields; if a field's ref matches this entity, create a hasMany relation on the parent entity.
//end loop
db.top_student_features.belongsTo(db.exams, {
as: 'exam',
foreignKey: {
name: 'examId',
},
constraints: false,
});
db.top_student_features.belongsTo(db.students, {
as: 'student',
foreignKey: {
name: 'studentId',
},
constraints: false,
});
db.top_student_features.belongsTo(db.users, {
as: 'createdBy',
});
db.top_student_features.belongsTo(db.users, {
as: 'updatedBy',
});
};
return top_student_features;
};


@@ -0,0 +1,290 @@
const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');
module.exports = function(sequelize, DataTypes) {
const users = sequelize.define(
'users',
{
id: {
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4,
primaryKey: true,
},
firstName: {
type: DataTypes.TEXT,
},
lastName: {
type: DataTypes.TEXT,
},
phoneNumber: {
type: DataTypes.TEXT,
},
email: {
type: DataTypes.TEXT,
},
disabled: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
password: {
type: DataTypes.TEXT,
},
emailVerified: {
type: DataTypes.BOOLEAN,
allowNull: false,
defaultValue: false,
},
emailVerificationToken: {
type: DataTypes.TEXT,
},
emailVerificationTokenExpiresAt: {
type: DataTypes.DATE,
},
passwordResetToken: {
type: DataTypes.TEXT,
},
passwordResetTokenExpiresAt: {
type: DataTypes.DATE,
},
provider: {
type: DataTypes.TEXT,
},
importHash: {
type: DataTypes.STRING(255),
allowNull: true,
unique: true,
},
},
{
timestamps: true,
paranoid: true,
freezeTableName: true,
},
);
users.associate = (db) => {
db.users.belongsToMany(db.permissions, {
as: 'custom_permissions',
foreignKey: {
name: 'users_custom_permissionsId',
},
constraints: false,
through: 'usersCustom_permissionsPermissions',
});
db.users.belongsToMany(db.permissions, {
as: 'custom_permissions_filter',
foreignKey: {
name: 'users_custom_permissionsId',
},
constraints: false,
through: 'usersCustom_permissionsPermissions',
});
    /// Loop through entities and their fields; if a field's ref matches this entity, create a hasMany relation on the parent entity.
db.users.hasMany(db.staff_members, {
as: 'staff_members_user',
foreignKey: {
name: 'userId',
},
constraints: false,
});
db.users.hasMany(db.students, {
as: 'students_user',
foreignKey: {
name: 'userId',
},
constraints: false,
});
db.users.hasMany(db.messages, {
as: 'messages_sender',
foreignKey: {
name: 'senderId',
},
constraints: false,
});
db.users.hasMany(db.messages, {
as: 'messages_recipient',
foreignKey: {
name: 'recipientId',
},
constraints: false,
});
//end loop
db.users.belongsTo(db.roles, {
as: 'app_role',
foreignKey: {
name: 'app_roleId',
},
constraints: false,
});
db.users.hasMany(db.file, {
as: 'avatar',
foreignKey: 'belongsToId',
constraints: false,
scope: {
belongsTo: db.users.getTableName(),
belongsToColumn: 'avatar',
},
});
db.users.belongsTo(db.users, {
as: 'createdBy',
});
db.users.belongsTo(db.users, {
as: 'updatedBy',
});
};
users.beforeCreate((users, options) => {
users = trimStringFields(users);
if (users.provider !== providers.LOCAL && Object.values(providers).indexOf(users.provider) > -1) {
users.emailVerified = true;
if (!users.password) {
const password = crypto
.randomBytes(20)
.toString('hex');
const hashedPassword = bcrypt.hashSync(
password,
config.bcrypt.saltRounds,
);
        users.password = hashedPassword;
}
}
});
users.beforeUpdate((users, options) => {
users = trimStringFields(users);
});
return users;
};
function trimStringFields(users) {
  // Guard against a missing email so .trim() cannot throw on partial records.
  users.email = users.email ? users.email.trim() : users.email;
users.firstName = users.firstName
? users.firstName.trim()
: null;
users.lastName = users.lastName
? users.lastName.trim()
: null;
return users;
}
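The `trimStringFields` helper above normalizes whitespace on user records before the `beforeCreate`/`beforeUpdate` hooks persist them. A minimal standalone sketch of the idea (with a guard on `email`, an assumption for records where it may be absent):

```javascript
// Sketch of the trimStringFields normalization: trim email/firstName/lastName,
// mapping empty optional names to null, and return the same record object.
function trimStringFields(users) {
  users.email = users.email ? users.email.trim() : users.email;
  users.firstName = users.firstName ? users.firstName.trim() : null;
  users.lastName = users.lastName ? users.lastName.trim() : null;
  return users;
}

const record = { email: '  admin@example.com ', firstName: ' Admin ', lastName: null };
const trimmed = trimStringFields(record);
// trimmed.email === 'admin@example.com', trimmed.firstName === 'Admin', trimmed.lastName === null
```

Because the helper mutates and returns the same object, the hooks can reassign `users = trimStringFields(users)` without copying.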

backend/src/db/reset.js Normal file

@@ -0,0 +1,16 @@
const db = require('./models');
const {execSync} = require("child_process");
console.log('Resetting Database');
db.sequelize
.sync({ force: true })
.then(() => {
execSync("sequelize db:seed:all");
console.log('OK');
process.exit();
})
.catch((error) => {
console.error(error);
process.exit(1);
});


@@ -0,0 +1,66 @@
'use strict';
const bcrypt = require("bcrypt");
const config = require("../../config");
const ids = [
'193bf4b5-9f07-4bd5-9a43-e7e41f3e96af',
'af5a87be-8f9c-4630-902a-37a60b7005ba',
'5bc531ab-611f-41f3-9373-b7cc5d09c93d',
]
module.exports = {
up: async (queryInterface, Sequelize) => {
let admin_hash = bcrypt.hashSync(config.admin_pass, config.bcrypt.saltRounds);
let user_hash = bcrypt.hashSync(config.user_pass, config.bcrypt.saltRounds);
try {
await queryInterface.bulkInsert('users', [
{
id: ids[0],
firstName: 'Admin',
email: config.admin_email,
emailVerified: true,
provider: config.providers.LOCAL,
password: admin_hash,
createdAt: new Date(),
updatedAt: new Date()
},
{
id: ids[1],
firstName: 'John',
email: 'john@doe.com',
emailVerified: true,
provider: config.providers.LOCAL,
password: user_hash,
createdAt: new Date(),
updatedAt: new Date()
},
{
id: ids[2],
firstName: 'Client',
email: 'client@hello.com',
emailVerified: true,
provider: config.providers.LOCAL,
password: user_hash,
createdAt: new Date(),
updatedAt: new Date()
},
]);
} catch (error) {
console.error('Error during bulkInsert:', error);
throw error;
}
},
down: async (queryInterface, Sequelize) => {
try {
await queryInterface.bulkDelete('users', {
id: {
[Sequelize.Op.in]: ids,
},
}, {});
} catch (error) {
console.error('Error during bulkDelete:', error);
throw error;
}
}
}
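The `down()` migration above uses Sequelize's `Op.in` operator to delete only the rows inserted by `up()`, keyed on the fixed `ids` list. In plain JavaScript terms (a dependency-free sketch, not Sequelize itself), the filter is just a membership test on that list:

```javascript
// Plain-JS equivalent of the `{ id: { [Sequelize.Op.in]: ids } }` filter
// used by bulkDelete: a row matches iff its id appears in the seeded-id list.
const ids = [
  '193bf4b5-9f07-4bd5-9a43-e7e41f3e96af',
  'af5a87be-8f9c-4630-902a-37a60b7005ba',
  '5bc531ab-611f-41f3-9373-b7cc5d09c93d',
];

const rows = [
  { id: ids[0], firstName: 'Admin' },
  { id: 'some-other-id', firstName: 'Keep' },       // not seeded; must survive
  { id: ids[2], firstName: 'Client' },
];

const deleted = rows.filter((row) => ids.includes(row.id));    // what bulkDelete removes
const remaining = rows.filter((row) => !ids.includes(row.id)); // what it leaves alone
```

Pinning the UUIDs in a shared `ids` constant is what makes the seeder reversible: `down()` can target exactly the rows `up()` created without touching user-created accounts.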

File diff suppressed because it is too large.

File diff suppressed because it is too large.

backend/src/db/utils.js Normal file

@@ -0,0 +1,27 @@
const validator = require('validator');
const { v4: uuid } = require('uuid');
const Sequelize = require('./models').Sequelize;
module.exports = class Utils {
static uuid(value) {
let id = value;
if (!validator.isUUID(id)) {
id = uuid();
}
return id;
}
static ilike(model, column, value) {
return Sequelize.where(
Sequelize.fn(
'lower',
Sequelize.col(`${model}.${column}`),
),
{
[Sequelize.Op.like]: `%${value}%`.toLowerCase(),
},
);
}
};

backend/src/helpers.js Normal file

@@ -0,0 +1,23 @@
const jwt = require('jsonwebtoken');
const config = require('./config');
module.exports = class Helpers {
static wrapAsync(fn) {
return function (req, res, next) {
fn(req, res, next).catch(next);
};
}
static commonErrorHandler(error, req, res, next) {
if ([400, 403, 404].includes(error.code)) {
return res.status(error.code).send(error.message);
}
console.error(error);
return res.status(500).send(error.message);
}
static jwtSign(data) {
return jwt.sign(data, config.secret_key, {expiresIn: '6h'});
};
};
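`wrapAsync` is what lets async route handlers throw plain errors: any rejection is funneled into Express's `next()`, where `commonErrorHandler` maps `error.code` to an HTTP status. A dependency-free sketch of that flow, simulating the Express call with plain objects:

```javascript
// Sketch of Helpers.wrapAsync: the returned handler forwards any
// rejection from the async fn into next(), instead of letting it
// escape as an unhandled promise rejection.
function wrapAsync(fn) {
  return function (req, res, next) {
    fn(req, res, next).catch(next);
  };
}

let forwarded = null;
const handler = wrapAsync(async (req, res) => {
  // A handler throwing a coded error, as AuthService methods do.
  throw Object.assign(new Error('Not found'), { code: 404 });
});

// Invoke as Express would; the rejection lands in next() on the next tick.
handler({}, {}, (err) => { forwarded = err; });
```

Without the wrapper, the `throw` inside an `async` handler would never reach Express's error middleware, because Express only catches synchronous throws.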

Some files were not shown because too many files have changed in this diff.