Initial version

commit 8d80776c8c

305  .cursorrules  Normal file
@@ -0,0 +1,305 @@
# Cursor Rules - Group 1: Development Philosophy & Coding Conventions

1. Overall Architecture & Structure:

   - Enforce a clear separation of concerns between the backend and the frontend:
     - **Backend**: Use Express for routing, Passport for authentication, and Swagger for API documentation. Organize code into modules such as routes, services, and helpers.
       - **Example**:
         - Routes: `src/routes/auth.js` for authentication routes.
         - Services: `src/services/auth.js` for authentication logic.
         - Helpers: `src/helpers/wrapAsync.js` for wrapping asynchronous functions.
     - **Frontend**: Use Next.js with React and TypeScript. Structure components using functional components, hooks, and layouts.
       - **Example**:
         - Pages: `pages/index.tsx` for the main page.
         - Components: `components/Header.tsx` for the header component.
         - Layouts: `layouts/MainLayout.tsx` for common page layouts.
   - Ensure that backend modules and frontend components are organized for reusability and maintainability:
     - **Backend**: Separate business logic into services and use middleware for common tasks.
     - **Frontend**: Use reusable components and hooks to manage state and lifecycle.

2. Coding Style & Formatting:

   - For the backend (JavaScript):
     • Use ES6+ features (const/let, arrow functions) consistently.
     • Follow Prettier and ESLint configurations (e.g., consistent 2-space indentation, semicolons, and single quotes).
     • Maintain clear asynchronous patterns with helper wrappers (e.g., wrapAsync); a reference sketch of such a wrapper closes this list.
       - **Example from auth.js**:
         ```javascript
         router.post('/signin/local', wrapAsync(async (req, res) => {
           const payload = await AuthService.signin(req.body.email, req.body.password, req);
           res.status(200).send(payload);
         }));
         ```
     • Document API endpoints with inline Swagger comments to ensure API clarity and consistency.
       - **Example**:
         ```javascript
         /**
          * @swagger
          * /api/auth/signin:
          *   post:
          *     summary: Sign in a user
          *     responses:
          *       200:
          *         description: Successful login
          */
         ```
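     • **Reference sketch**: a minimal version of the async wrapper mentioned above, modeled on the `wrapAsync` helper in `app-shell/src/helpers.js` from this commit; the backend's own `src/helpers/wrapAsync.js` is not shown here and may differ in detail.
       ```javascript
       // Wrap an async route handler so that rejected promises are passed to
       // Express's error-handling middleware instead of being swallowed.
       function wrapAsync(fn) {
         return function (req, res, next) {
           fn(req, res, next).catch(next);
         };
       }

       module.exports = wrapAsync;
       ```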
   - For the frontend (TypeScript/React):
     • Use functional components with strict typing and separation of concerns.
       - **Example**:
         ```typescript
         const Button: React.FC<{ onClick: () => void }> = ({ onClick }) => (
           <button onClick={onClick}>Click me</button>
         );
         ```
     • Follow naming conventions: PascalCase for components and types/interfaces, camelCase for variables, hooks, and function names.
       - **Example**:
         ```typescript
         const useCustomHook = () => {
           const [state, setState] = useState(false);
           return [state, setState];
         };
         ```
     • Utilize hooks (useEffect, useState) to manage state and lifecycle in a clear and concise manner.
       - **Example**:
         ```typescript
         useEffect(() => {
           console.log('Component mounted');
         }, []);
         ```

3. Code Quality & Best Practices:

   - Ensure code modularity by splitting complex logic into smaller, testable units.
     - **Example**: In `auth.js`, routes are separated from business logic, which is handled in `AuthService`.
   - Write self-documenting code and add comments where the logic is non-trivial.
     - **Example**: Use descriptive function and variable names in `auth.js`, and add comments for complex asynchronous operations.
   - Embrace declarative programming and adhere to SOLID principles.
     - **Example**: In service functions, ensure each function has a single responsibility and dependencies are injected rather than hardcoded (see the sketch below).
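   - **Sketch**: a hypothetical service illustrating constructor injection; the class and the mailer API shown here are not from the codebase.
     ```javascript
     // The mailer dependency is injected, so it can be swapped out in tests.
     class NotificationService {
       constructor(mailer) {
         this.mailer = mailer;
       }

       async notifySignup(user) {
         await this.mailer.send(user.email, 'Welcome aboard!');
       }
     }

     module.exports = NotificationService;
     ```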
4. Consistency & Tools Integration:

   - Leverage existing tools like Prettier and ESLint to automatically enforce style and formatting rules.
     - **Example**: Use `.prettierrc` and `.eslintrc.cjs` for configuration in your project.
   - Use TypeScript in the frontend to ensure type safety and catch errors early.
     - **Example**: Define interfaces and types in your React components to enforce strict typing.
   - Maintain uniformity in API design and error handling strategies.
     - **Example**: Consistently use Passport for authentication and a common error handling middleware in `auth.js`.
## Group 2 – Naming Conventions

1. File Naming and Structure:

   • Frontend:
     - Page Files: Use lower-case filenames (e.g., index.tsx) as prescribed by Next.js conventions.
       - **Example**: `pages/index.tsx`, `pages/about.tsx`
     - Component Files: Use PascalCase for React component files (e.g., WebSiteHeader.tsx, NavBar.tsx).
       - **Example**: `components/Header.tsx`, `components/Footer.tsx`
     - Directories: Use clear, descriptive names (e.g., 'pages', 'components', 'WebPageComponents').
       - **Example**: `src/pages`, `src/components`
   • Backend:
     - Use lower-case filenames for modules (e.g., index.js, auth.js, projects.js).
       - **Example**: `routes/auth.js`, `services/user.js`
     - When needed, use hyphenation for clarity, but maintain consistency.
       - **Example**: `helpers/wrap-async.js`

2. Component and Module Naming:

   • Frontend:
     - React Components: Define components in PascalCase.
     - TypeScript Interfaces/Types: Use PascalCase (e.g., WebSiteHeaderProps).
   • Backend:
     - Classes (if any) and constructors should be in PascalCase; most helper functions and modules use camelCase.
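       - **Example** (an illustrative sketch; the class body shown here is hypothetical):
         ```javascript
         // PascalCase for a class; camelCase for a plain helper function.
         class AuthService {
           constructor(userRepository) {
             this.userRepository = userRepository;
           }
         }

         function wrapAsync(fn) {
           return (req, res, next) => fn(req, res, next).catch(next);
         }
         ```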
3. Variable, Function, and Hook Naming:

   • Use camelCase for variables and function names in both frontend and backend.
     - **Example**:
       ```javascript
       const userName = 'John Doe';
       function handleLogin() { ... }
       ```
   • Custom Hooks: Prefix with 'use' (e.g., useAuth, useForm).
     - **Example**:
       ```typescript
       const useAuth = () => {
         const [isAuthenticated, setIsAuthenticated] = useState(false);
         return { isAuthenticated, setIsAuthenticated };
       };
       ```

4. Consistency and Readability:

   • Maintain uniform naming across the project to ensure clarity and ease of maintenance.
     - **Example**: Use consistent naming conventions for variables, functions, and components, such as camelCase for variables and functions, and PascalCase for components.
     - **Example**: In `auth.js`, ensure that all function names clearly describe their purpose, such as `handleLogin` or `validateUserInput`.
## Group 3 – Frontend & React Best Practices

1. Use of Functional Components & TypeScript:

   • Build all components as functional components.
     - **Example**:
       ```typescript
       const Header: React.FC = () => {
         return <header>Header Content</header>;
       };
       ```
   • Leverage TypeScript for static type checking and enforce strict prop and state types.
     - **Example**:
       ```typescript
       interface ButtonProps {
         onClick: () => void;
       }

       const Button: React.FC<ButtonProps> = ({ onClick }) => (
         <button onClick={onClick}>Click me</button>
       );
       ```

2. Effective Use of React Hooks:

   • Utilize useState and useEffect appropriately with proper dependency arrays.
     - **Example**:
       ```typescript
       const [count, setCount] = useState(0);

       useEffect(() => {
         console.log('Component mounted');
       }, []);
       ```
   • Create custom hooks to encapsulate shared logic (e.g., useAppSelector).
     - **Example**:
       ```typescript
       const useAuth = () => {
         const [isAuthenticated, setIsAuthenticated] = useState(false);
         return { isAuthenticated, setIsAuthenticated };
       };
       ```

3. Component Composition & Separation of Concerns:

   • Separate presentational (stateless) components from container components managing logic.
     - **Example**: Use `LayoutGuest` to encapsulate common page structures.

4. Code Quality & Readability:

   • Maintain consistent formatting and adhere to Prettier and ESLint rules.
   • Use descriptive names for variables, functions, and components.
   • Document non-trivial logic with inline comments and consider implementing error boundaries where needed.
   • New code must adhere to these conventions to avoid ambiguity.
   • Use descriptive names that reflect the purpose and domain, avoiding abbreviations unless standard in the project.
## Group 4 – Backend & API Guidelines

1. API Endpoint Design & Documentation:

   • Follow RESTful naming conventions; all route handlers should be named clearly and consistently.
     - **Example**: Use HTTP verbs like `GET`, `POST`, `PUT`, and `DELETE` to define actions, e.g., `GET /api/auth/me` to retrieve user info.
   • Document endpoints with Swagger annotations to provide descriptions, expected request bodies, and response codes.
     - **Example**:
       ```javascript
       /**
        * @swagger
        * /api/auth/signin:
        *   post:
        *     summary: Sign in a user
        *     requestBody:
        *       description: User credentials
        *       content:
        *         application/json:
        *           schema:
        *             $ref: "#/components/schemas/Auth"
        *     responses:
        *       200:
        *         description: Successful login
        *       400:
        *         description: Invalid username/password supplied
        */
       ```
   • Examples (for Auth endpoints):
     - POST /api/auth/signin/local
       • Description: Logs the user into the system.
       • Request Body (application/json):
         { "email": "admin@flatlogic.com", "password": "password" }
       • Responses:
         - 200: Successful login (returns token and user data).
         - 400: Invalid username/password supplied.
     - GET /api/auth/me
       • Description: Retrieves current authorized user information.
       • Secured via Passport JWT; uses req.currentUser.
       • Responses:
         - 200: Returns current user info.
         - 400: Invalid credentials or missing user data.
     - POST /api/auth/signup
       • Description: Registers a new user.
       • Request Body (application/json):
         { "email": "admin@flatlogic.com", "password": "password" }
       • Responses:
         - 200: New user signed up successfully.
         - 400: Invalid input supplied.
         - 500: Server error.
## Group 5 – Testing, Quality Assurance & Error Handling

1. Testing Guidelines:

   • Write unit tests for critical backend and frontend components using frameworks such as Jest, React Testing Library, and Mocha/Chai.
     - **Example**:
       ```javascript
       test('should return user data', async () => {
         const user = await getUserData();
         expect(user).toHaveProperty('email');
       });
       ```
   • Practice test-driven development and maintain high test coverage.
   • Regularly update tests following changes in business logic.

2. Quality Assurance:

   • Enforce code quality with ESLint, Prettier, and static analysis tools.
   • Integrate continuous testing workflows (CI/CD) to catch issues early.
     - **Example**: Use GitHub Actions for automated testing and deployment.
   • Ensure documentation is kept up to date with the implemented code.

3. Error Handling:

   • Back-end:
     - Wrap asynchronous route handlers with a helper (e.g., wrapAsync) to capture errors.
       - **Example**:
         ```javascript
         router.post('/signin', wrapAsync(async (req, res) => {
           const user = await AuthService.signin(req.body);
           res.send(user);
         }));
         ```
     - Use centralized error handling middleware (e.g., commonErrorHandler) for uniform error responses; a reference sketch closes this item.
   • Front-end:
     - Implement error boundaries in React to gracefully handle runtime errors.
     - Display user-friendly error messages and log errors for further analysis.
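   • **Reference sketch** (back-end): a minimal centralized Express error handler. It mirrors the `commonErrorHandler` shipped in `app-shell/src/helpers.js` in this commit; the backend's own middleware is not shown here and may differ.
     ```javascript
     // Known client errors (e.g. ValidationError, ForbiddenError) carry an HTTP
     // code on the error object; anything else is logged and becomes a 500.
     function commonErrorHandler(error, req, res, next) {
       if ([400, 403, 404].includes(error.code)) {
         return res.status(error.code).send(error.message);
       }
       console.error(error);
       return res.status(500).send(error.message);
     }

     module.exports = commonErrorHandler;
     ```
     Mount it after all routes, as the route files in this commit do: `router.use('/', require('../helpers').commonErrorHandler);`.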
4. Authentication & Security:

   • Protect endpoints by using Passport.js with JWT (e.g., passport.authenticate('jwt', { session: false })).
     - **Example**:
       ```javascript
       router.get('/profile', passport.authenticate('jwt', { session: false }), (req, res) => {
         res.send(req.user);
       });
       ```
   • Ensure that secure routes check for the existence of req.currentUser; if it is absent, return a ForbiddenError (see the reference sketch below).
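   • **Reference sketch**: a hypothetical guard middleware for the req.currentUser check above. `ForbiddenError` (HTTP 403) and the `auth.forbidden` message key do exist in this commit under `app-shell/src/services/notifications`, but the require path and the middleware name here are illustrative and may not match the backend.
     ```javascript
     const ForbiddenError = require('../services/notifications/errors/forbidden');

     // Reject requests that passed JWT parsing but carry no authenticated user.
     function requireCurrentUser(req, res, next) {
       if (!req.currentUser) {
         return next(new ForbiddenError('auth.forbidden'));
       }
       next();
     }

     module.exports = requireCurrentUser;
     ```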
5. Consistent Error Handling & Middleware Usage:

   • Wrap asynchronous route handlers with helpers like wrapAsync for error propagation.
   • Use centralized error handling middleware (e.g., commonErrorHandler) to capture and format errors uniformly.

6. Modular Code Organization:

   • Organize backend code into separate files for routes, services, and database access (e.g., auth.js, projects.js, tasks.js).
   • Use descriptive, lowercase filenames for modules and routes.

7. Endpoint Security Best Practices:

   • Validate input data and sanitize requests where necessary (see the reference sketch after this list).
   • Restrict sensitive operations to authenticated users with proper role-based permissions.
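   • **Reference sketch**: a minimal hand-rolled validation middleware for the signup body. `ValidationError` (HTTP 400) and the `auth.invalidEmail` / `auth.weakPassword` message keys exist in this commit under `app-shell/src/services/notifications`, but the require path, the middleware name, and the 8-character minimum are illustrative assumptions; a dedicated validation library may be used instead.
     ```javascript
     const ValidationError = require('../services/notifications/errors/validation');

     // Basic shape checks and light sanitization before the data reaches the service layer.
     function validateSignupBody(req, res, next) {
       const { email, password } = req.body || {};
       if (typeof email !== 'string' || !email.includes('@')) {
         return next(new ValidationError('auth.invalidEmail'));
       }
       if (typeof password !== 'string' || password.length < 8) {
         return next(new ValidationError('auth.weakPassword'));
       }
       req.body.email = email.trim().toLowerCase();
       next();
     }

     module.exports = validateSignupBody;
     ```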
## Group 6 – Accessibility, UI, and Styling Guidelines (Updated)

1. Sidebar Styling:

   • The sidebar is implemented in the authenticated layout via the AsideMenu component, with the actual element defined in AsideMenuLayer (located at frontend/src/components/AsideMenuLayer.tsx) as an <aside> element with id="asideMenu".
     - **Example**:
       ```css
       #asideMenu {
         background-color: #F8F4E1 !important;
       }
       ```
   • When modifying sidebar styles, target #asideMenu and its child elements rather than generic selectors (e.g., avoid .app-sidebar) to ensure that the changes affect the actual rendered sidebar.
   • Remove or override any conflicting background utilities (such as an unwanted bg-white) so our desired background color (#F8F4E1) is fully visible. Use a highly specific selector if necessary.
   • Adjust spacing (padding/margins) at both the container (#asideMenu) and the individual menu item level to maintain a consistent, compact design.

2. General Project Styling and Tailwind CSS Usage:

   • The application leverages Tailwind CSS extensively, with core styling defined in _theme.css using the @apply directive. Any new modifications should follow this pattern to ensure consistency.
     - **Example**:
       ```css
       .btn {
         @apply bg-blue-500 text-white;
       }
       ```
   • The themed blocks (like .theme-pink and .theme-green) standardize the UI's appearance. When applying custom overrides, ensure they integrate cleanly into these structures and avoid conflicts or circular dependency errors (e.g., issues when redefining utilities such as text-blue-600).
   • Adjustments via Tailwind CSS generally require modifying class names in the components and ensuring that global overrides are applied in the correct order. Consistent use of design tokens and custom color codes (e.g., #F8F4E1) throughout the app is crucial to a cohesive design.
   • Specificity is key. If a change isn't visually reflected as expected, inspect the rendered HTML to identify which classes are taking precedence.
3  .dockerignore  Normal file
@@ -0,0 +1,3 @@
backend/node_modules
frontend/node_modules
frontend/build
3  .gitignore  vendored  Normal file
@@ -0,0 +1,3 @@
node_modules/
*/node_modules/
*/build/
184  502.html  Normal file
@@ -0,0 +1,184 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>Service Starting</title>
|
||||
<style>
|
||||
body {
|
||||
font-family: sans-serif;
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
justify-content: center;
|
||||
align-items: center;
|
||||
min-height: 100vh;
|
||||
background-color: #EFF2FF;
|
||||
margin: 0;
|
||||
padding: 20px;
|
||||
}
|
||||
|
||||
.container {
|
||||
text-align: center;
|
||||
padding: 30px 40px;
|
||||
background-color: #fff;
|
||||
border-radius: 20px;
|
||||
margin-bottom: 20px;
|
||||
max-width: 538px;
|
||||
width: 100%;
|
||||
box-shadow: 0 13px 34px 0 rgba(167, 187, 242, 0.2);
|
||||
box-sizing: border-box;
|
||||
}
|
||||
|
||||
#status-heading {
|
||||
font-size: 24px;
|
||||
font-weight: 700;
|
||||
color: #02004E;
|
||||
margin-bottom: 20px;
|
||||
}
|
||||
|
||||
h2 {
|
||||
color: #333;
|
||||
margin-bottom: 15px;
|
||||
}
|
||||
|
||||
p {
|
||||
color: #666;
|
||||
font-size: 1.1em;
|
||||
margin-bottom: 10px;
|
||||
}
|
||||
|
||||
.tip {
|
||||
font-weight: 300;
|
||||
font-size: 17px;
|
||||
line-height: 150%;
|
||||
letter-spacing: 0;
|
||||
text-align: center;
|
||||
margin-top: 30px;
|
||||
}
|
||||
|
||||
.loader-container {
|
||||
position: relative;
|
||||
display: flex;
|
||||
justify-content: center;
|
||||
align-items: center;
|
||||
}
|
||||
|
||||
.loader {
|
||||
width: 100px;
|
||||
aspect-ratio: 1;
|
||||
border-radius: 50%;
|
||||
background:
|
||||
radial-gradient(farthest-side, #5C7EF1 94%, #0000) top/8px 8px no-repeat,
|
||||
conic-gradient(#0000 30%, #5C7EF1);
|
||||
-webkit-mask: radial-gradient(farthest-side, #0000 calc(100% - 8px), #000 0);
|
||||
animation: l13 2s infinite linear;
|
||||
}
|
||||
|
||||
@keyframes l13 {
|
||||
100% {
|
||||
transform: rotate(1turn)
|
||||
}
|
||||
}
|
||||
|
||||
.app-logo {
|
||||
position: absolute;
|
||||
width: 36px;
|
||||
}
|
||||
|
||||
.panel {
|
||||
padding: 0 18px;
|
||||
display: none;
|
||||
background-color: white;
|
||||
overflow: hidden;
|
||||
margin-top: 10px;
|
||||
}
|
||||
|
||||
.project-info {
|
||||
border: 1px solid #8C9DFF;
|
||||
border-radius: 10px;
|
||||
padding: 12px 16px;
|
||||
max-width: 600px;
|
||||
margin: 40px auto;
|
||||
background-color: #FBFCFF;
|
||||
}
|
||||
|
||||
.project-info h2 {
|
||||
color: #02004E;
|
||||
font-size: 14px;
|
||||
font-weight: 500;
|
||||
margin-bottom: 10px;
|
||||
text-align: left;
|
||||
}
|
||||
|
||||
.project-info p {
|
||||
color: #686791;
|
||||
font-size: 12px;
|
||||
font-weight: 400;
|
||||
text-align: left;
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
|
||||
<body>
|
||||
<div class="container">
|
||||
<h2 id="status-heading">Loading the app, just a moment…</h2>
|
||||
        <p class="tip">The application is currently launching. The page will automatically refresh once the site is available.</p>
|
||||
<div class="project-info">
|
||||
<h2>AI Agent Hub</h2>
|
||||
<p>Manage specialized AI agents with user roles.</p>
|
||||
</div>
|
||||
<div class="loader-container">
|
||||
<img src="https://flatlogic.com/blog/wp-content/uploads/2025/05/logo-bot-1.png" alt="App Logo"
|
||||
class="app-logo">
|
||||
<div class="loader"></div>
|
||||
</div>
|
||||
<div class="panel">
|
||||
<video width="560" height="315" controls loop>
|
||||
<source
|
||||
src="https://flatlogic.com/blog/wp-content/uploads/2025/04/20250430_1336_professional_dynamo_spinner_simple_compose_01jt349yvtenxt7xhg8hhr85j8.mp4"
|
||||
type="video/mp4">
|
||||
Your browser does not support the video tag.
|
||||
</video>
|
||||
</div>
|
||||
</div>
|
||||
<script>
|
||||
function checkAvailability() {
|
||||
fetch('/')
|
||||
.then(response => {
|
||||
if (response.ok) {
|
||||
window.location.reload();
|
||||
} else {
|
||||
setTimeout(checkAvailability, 5000);
|
||||
}
|
||||
})
|
||||
.catch(() => {
|
||||
setTimeout(checkAvailability, 5000);
|
||||
});
|
||||
}
|
||||
document.addEventListener('DOMContentLoaded', checkAvailability);
|
||||
|
||||
document.addEventListener('DOMContentLoaded', function () {
|
||||
const appTitle = document.querySelector('#status-heading');
|
||||
const panel = document.querySelector('.panel');
|
||||
const video = panel.querySelector('video');
|
||||
let clickCount = 0;
|
||||
|
||||
appTitle.addEventListener('click', function () {
|
||||
clickCount++;
|
||||
if (clickCount === 5) {
|
||||
panel.classList.toggle('show');
|
||||
if (panel.classList.contains('show')) {
|
||||
video.play();
|
||||
} else {
|
||||
video.pause();
|
||||
}
|
||||
clickCount = 0;
|
||||
}
|
||||
});
|
||||
});
|
||||
</script>
|
||||
</body>
|
||||
|
||||
</html>
|
||||
17  Dockerfile  Normal file
@@ -0,0 +1,17 @@
FROM node:20.15.1-alpine AS builder
RUN apk add --no-cache git
WORKDIR /app
COPY frontend/package.json frontend/yarn.lock ./
RUN yarn install --pure-lockfile
COPY frontend .
RUN yarn build

FROM node:20.15.1-alpine
WORKDIR /app
COPY backend/package.json backend/yarn.lock ./
RUN yarn install --pure-lockfile
COPY backend .

COPY --from=builder /app/build /app/public
CMD ["yarn", "start"]
73  Dockerfile.dev  Normal file
@@ -0,0 +1,73 @@
|
||||
# Base image for Node.js dependencies
|
||||
FROM node:20.15.1-alpine AS frontend-deps
|
||||
RUN apk add --no-cache git
|
||||
WORKDIR /app/frontend
|
||||
COPY frontend/package.json frontend/yarn.lock ./
|
||||
RUN yarn install --pure-lockfile
|
||||
|
||||
FROM node:20.15.1-alpine AS backend-deps
|
||||
RUN apk add --no-cache git
|
||||
WORKDIR /app/backend
|
||||
COPY backend/package.json backend/yarn.lock ./
|
||||
RUN yarn install --pure-lockfile
|
||||
|
||||
FROM node:20.15.1-alpine AS app-shell-deps
|
||||
RUN apk add --no-cache git
|
||||
WORKDIR /app/app-shell
|
||||
COPY app-shell/package.json app-shell/yarn.lock ./
|
||||
RUN yarn install --pure-lockfile
|
||||
|
||||
# Nginx setup and application build
|
||||
FROM node:20.15.1-alpine AS build
|
||||
RUN apk add --no-cache git nginx
|
||||
RUN apk add --no-cache lsof procps
|
||||
RUN yarn global add concurrently
|
||||
|
||||
RUN mkdir -p /app/pids
|
||||
|
||||
# Make sure to add yarn global bin to PATH
|
||||
ENV PATH /root/.yarn/bin:/root/.config/yarn/global/node_modules/.bin:$PATH
|
||||
|
||||
# Copy dependencies
|
||||
WORKDIR /app
|
||||
COPY --from=frontend-deps /app/frontend /app/frontend
|
||||
COPY --from=backend-deps /app/backend /app/backend
|
||||
COPY --from=app-shell-deps /app/app-shell /app/app-shell
|
||||
|
||||
COPY frontend /app/frontend
|
||||
COPY backend /app/backend
|
||||
COPY app-shell /app/app-shell
|
||||
COPY docker /app/docker
|
||||
|
||||
# Copy Nginx configuration
|
||||
COPY nginx.conf /etc/nginx/nginx.conf
|
||||
|
||||
# Copy custom error page
|
||||
COPY 502.html /usr/share/nginx/html/502.html
|
||||
|
||||
# Change owner and permissions of the error page
|
||||
RUN chown nginx:nginx /usr/share/nginx/html/502.html && \
|
||||
chmod 644 /usr/share/nginx/html/502.html
|
||||
|
||||
# Copy all files from root to /app
|
||||
COPY . /app
|
||||
|
||||
# Expose the port the app runs on
|
||||
EXPOSE 8080
|
||||
ENV NODE_ENV=dev_stage
|
||||
ENV FRONT_PORT=3001
|
||||
ENV BACKEND_PORT=3000
|
||||
ENV APP_SHELL_PORT=4000
|
||||
|
||||
CMD ["sh", "-c", "\
|
||||
yarn --cwd /app/frontend dev & echo $! > /app/pids/frontend.pid && \
|
||||
yarn --cwd /app/backend start & echo $! > /app/pids/backend.pid && \
|
||||
sleep 10 && nginx -g 'daemon off;' & \
|
||||
NGINX_PID=$! && \
|
||||
echo 'Waiting for backend (port 3000) to be available...' && \
|
||||
while ! nc -z localhost ${BACKEND_PORT}; do \
|
||||
sleep 2; \
|
||||
done && \
|
||||
echo 'Backend is up. Starting app_shell for Git check...' && \
|
||||
yarn --cwd /app/app-shell start && \
|
||||
wait $NGINX_PID"]
|
||||
200  README.md  Normal file
@@ -0,0 +1,200 @@
# AI Agent Hub

## This project was generated by [Flatlogic Platform](https://flatlogic.com).

- Frontend: [React.js](https://flatlogic.com/templates?framework%5B%5D=react&sort=default)

- Backend: [NodeJS](https://flatlogic.com/templates?backend%5B%5D=nodejs&sort=default)

<details><summary>Backend Folder Structure</summary>

The generated application has the following backend folder structure:

The `src` folder contains your working files, which are later used to create the build. The `src` folder contains these folders:

- `auth` - configures the library for authentication and authorization;

- `db` - contains such folders as:

  - `api` - documentation that is automatically generated by jsdoc or other tools;

  - `migrations` - the skeleton of the database, i.e. all the actions performed on the database;

  - `models` - what represents the database for the backend;

  - `seeders` - the entities that create the initial data for the database.

- `routes` - contains all the routes created with Express Router; what they do is exported from a controller file;

- `services` - contains such folders as `emails` and `notifications`.
</details>

- Database: PostgreSQL

- app-shell: Core application framework that provides essential infrastructure services for the entire application.

-----------------------
### We offer two ways to start the project locally: by running the Frontend and Backend directly, or with Docker.
-----------------------

## To start the project:

### Backend:

> Please change the current folder: `cd backend`

#### Install local dependencies:
`yarn install`

------------

#### Adjust the local db:
##### 1. Install postgres:

MacOS:

`brew install postgres`

> If you don't have `brew`, please install it (https://brew.sh) and repeat `brew install postgres`.

Ubuntu:

`sudo apt update`

`sudo apt install postgresql postgresql-contrib`

##### 2. Create the db and an admin user:
Before running and testing the connection, make sure you have created a database as described in the configuration above. You can use the `psql` command to create a user and a database.

`psql postgres -U postgres`

Next, run these commands to create a new user with a password and grant it permission to create databases.

`postgres-# CREATE ROLE admin WITH LOGIN PASSWORD 'admin_pass';`

`postgres-# ALTER ROLE admin CREATEDB;`

Quit `psql`, then log in again using the new user you just created.

`postgres-# \q`

`psql postgres -U admin`

Run this command to create a new database.

`postgres=> CREATE DATABASE db_{your_project_name};`

Then grant the new user privileges on the new database and quit `psql`.

`postgres=> GRANT ALL PRIVILEGES ON DATABASE db_{your_project_name} TO admin;`

`postgres=> \q`

------------

#### Create the database:
`yarn db:create`

#### Start the production build:
`yarn start`

### Frontend:

> Please change the current folder: `cd frontend`

## To start the project with Docker:
### Description:

The project contains the **docker folder** and the `Dockerfile`.

The `Dockerfile` is used to deploy the project to Google Cloud.

The **docker folder** contains a couple of helper scripts:

- `docker-compose.yml` (all our services: web, backend, db are described here)
- `start-backend.sh` (starts the backend, but only after the database)
- `wait-for-it.sh` (imported from https://github.com/vishnubob/wait-for-it)

> To avoid breaking the application, we recommend that you don't edit the following files: everything inside the **docker folder** and the `Dockerfile`.

## Run services:

1. Install docker compose (https://docs.docker.com/compose/install/)

2. Move to the `docker` folder. All the next steps should be done from this folder.

   ``` cd docker ```

3. Make `wait-for-it.sh` and `start-backend.sh` executable:

   ``` chmod +x start-backend.sh && chmod +x wait-for-it.sh ```

4. Download the dependent projects for the services.

5. Review the docker-compose.yml file. Make sure that all services have Dockerfiles. Only the db service doesn't require a Dockerfile.

6. Make sure the needed ports (see them in `ports`) are available on your local machine.

7. Start the services:

   7.1. With an empty database: `rm -rf data && docker-compose up`

   7.2. With database data stored from previous runs: `docker-compose up`

8. Check http://localhost:3000

9. Stop the services:

   9.1. Just press `Ctrl+C`

## Most common errors:

1. `connection refused`

   There could be many reasons, but the most common are:

   - The port is not open on the destination machine.

   - The port is open on the destination machine, but its backlog of pending connections is full.

   - A firewall between the client and server is blocking access (also check local firewalls).

   After checking for firewalls and that the port is open, use telnet to connect to the IP/port to test connectivity. This removes any potential issues from your application.

   ***MacOS:***

   If you suspect that your SSH service might be down, you can run this command to find out:

   `sudo service ssh status`

   If the command line returns a status of down, then you've likely found the reason behind your connectivity error.

   ***Ubuntu:***

   Sometimes a connection refused error can also indicate that there is an IP address conflict on your network. You can search for possible IP conflicts by running:

   `arp-scan -I eth0 -l | grep <ipaddress>`

   and

   `arping <ipaddress>`

2. `yarn db:create` creates the database with the assembled tables (on MacOS with a Postgres database)

   The workaround: run the following commands in your Postgres terminal:

   `DROP SCHEMA public CASCADE;`

   `CREATE SCHEMA public;`

   `GRANT ALL ON SCHEMA public TO postgres;`

   `GRANT ALL ON SCHEMA public TO public;`

   Afterwards, continue to start your project in the backend directory by running:

   `yarn start`
26  app-shell/.eslintrc.cjs  Normal file
@@ -0,0 +1,26 @@
|
||||
const globals = require('globals');
|
||||
|
||||
module.exports = [
|
||||
{
|
||||
files: ['**/*.js', '**/*.ts', '**/*.tsx'],
|
||||
languageOptions: {
|
||||
ecmaVersion: 2021,
|
||||
sourceType: 'module',
|
||||
globals: {
|
||||
...globals.browser,
|
||||
...globals.node,
|
||||
},
|
||||
parser: '@typescript-eslint/parser',
|
||||
},
|
||||
plugins: ['@typescript-eslint'],
|
||||
rules: {
|
||||
'no-unused-vars': 'warn',
|
||||
'no-console': 'off',
|
||||
'indent': ['error', 2],
|
||||
'quotes': ['error', 'single'],
|
||||
'semi': ['error', 'always'],
|
||||
|
||||
'@typescript-eslint/no-unused-vars': 'warn',
|
||||
},
|
||||
},
|
||||
];
|
||||
11  app-shell/.prettierrc  Normal file
@@ -0,0 +1,11 @@
|
||||
{
|
||||
"singleQuote": true,
|
||||
"tabWidth": 2,
|
||||
"printWidth": 80,
|
||||
"trailingComma": "all",
|
||||
"quoteProps": "as-needed",
|
||||
"jsxSingleQuote": true,
|
||||
"bracketSpacing": true,
|
||||
"bracketSameLine": false,
|
||||
"arrowParens": "always"
|
||||
}
|
||||
7  app-shell/.sequelizerc  Normal file
@@ -0,0 +1,7 @@
|
||||
const path = require('path');
|
||||
module.exports = {
|
||||
"config": path.resolve("src", "db", "db.config.js"),
|
||||
"models-path": path.resolve("src", "db", "models"),
|
||||
"seeders-path": path.resolve("src", "db", "seeders"),
|
||||
"migrations-path": path.resolve("src", "db", "migrations")
|
||||
};
|
||||
23  app-shell/Dockerfile  Normal file
@@ -0,0 +1,23 @@
|
||||
FROM node:20.15.1-alpine
|
||||
|
||||
RUN apk update && apk add bash
|
||||
# Create app directory
|
||||
WORKDIR /usr/src/app
|
||||
|
||||
# Install app dependencies
|
||||
# A wildcard is used to ensure both package.json AND package-lock.json are copied
|
||||
# where available (npm@5+)
|
||||
COPY package*.json ./
|
||||
|
||||
RUN yarn install
|
||||
# If you are building your code for production
|
||||
# RUN npm ci --only=production
|
||||
|
||||
|
||||
# Bundle app source
|
||||
COPY . .
|
||||
|
||||
|
||||
EXPOSE 4000
|
||||
|
||||
CMD [ "yarn", "start" ]
|
||||
13  app-shell/README.md  Normal file
@@ -0,0 +1,13 @@
|
||||
# test - template backend
|
||||
|
||||
#### Run App on local machine:
|
||||
|
||||
##### Install local dependencies:
|
||||
|
||||
- `yarn install`
|
||||
|
||||
---
|
||||
|
||||
##### Start build:
|
||||
|
||||
- `yarn start`
|
||||
42  app-shell/package.json  Normal file
@@ -0,0 +1,42 @@
|
||||
{
|
||||
"name": "app-shell",
|
||||
"description": "app-shell",
|
||||
"scripts": {
|
||||
"start": "node ./src/index.js"
|
||||
},
|
||||
"dependencies": {
|
||||
"@babel/parser": "^7.26.7",
|
||||
"adm-zip": "^0.5.16",
|
||||
"axios": "^1.6.7",
|
||||
"bcrypt": "5.1.1",
|
||||
"cors": "2.8.5",
|
||||
"eslint": "^9.13.0",
|
||||
"express": "4.18.2",
|
||||
"formidable": "1.2.2",
|
||||
"helmet": "4.1.1",
|
||||
"json2csv": "^5.0.7",
|
||||
"jsonwebtoken": "8.5.1",
|
||||
"lodash": "4.17.21",
|
||||
"moment": "2.30.1",
|
||||
"multer": "^1.4.4",
|
||||
"passport": "^0.7.0",
|
||||
"passport-google-oauth2": "^0.2.0",
|
||||
"passport-jwt": "^4.0.1",
|
||||
"passport-microsoft": "^0.1.0",
|
||||
"postcss": "^8.5.1",
|
||||
"sequelize-json-schema": "^2.1.1",
|
||||
"pg": "^8.13.3"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=18"
|
||||
},
|
||||
"private": true,
|
||||
"devDependencies": {
|
||||
"@typescript-eslint/eslint-plugin": "^8.12.2",
|
||||
"@typescript-eslint/parser": "^8.12.2",
|
||||
"cross-env": "7.0.3",
|
||||
"mocha": "8.1.3",
|
||||
"nodemon": "^3.1.7",
|
||||
"sequelize-cli": "6.6.2"
|
||||
}
|
||||
}
|
||||
5  app-shell/src/_schema.json  Normal file
File diff suppressed because one or more lines are too long
16  app-shell/src/config.js  Normal file
@@ -0,0 +1,16 @@
|
||||
|
||||
|
||||
const config = {
|
||||
schema_encryption_key: process.env.SCHEMA_ENCRYPTION_KEY || '',
|
||||
|
||||
project_uuid: '194c4336-7bad-454e-8302-d3b5ade71d6e',
|
||||
flHost: process.env.NODE_ENV === 'production' ? 'https://flatlogic.com/projects' : 'http://localhost:3000/projects',
|
||||
|
||||
gitea_domain: process.env.GITEA_DOMAIN || 'gitea.flatlogic.app',
|
||||
gitea_username: process.env.GITEA_USERNAME || 'admin',
|
||||
gitea_api_token: process.env.GITEA_API_TOKEN || null,
|
||||
github_repo_url: process.env.GITHUB_REPO_URL || null,
|
||||
github_token: process.env.GITHUB_TOKEN || null,
|
||||
};
|
||||
|
||||
module.exports = config;
|
||||
23  app-shell/src/helpers.js  Normal file
@@ -0,0 +1,23 @@
|
||||
const jwt = require('jsonwebtoken');
|
||||
const config = require('./config');
|
||||
|
||||
module.exports = class Helpers {
|
||||
static wrapAsync(fn) {
|
||||
return function (req, res, next) {
|
||||
fn(req, res, next).catch(next);
|
||||
};
|
||||
}
|
||||
|
||||
static commonErrorHandler(error, req, res, next) {
|
||||
if ([400, 403, 404].includes(error.code)) {
|
||||
return res.status(error.code).send(error.message);
|
||||
}
|
||||
|
||||
console.error(error);
|
||||
return res.status(500).send(error.message);
|
||||
}
|
||||
|
||||
static jwtSign(data) {
|
||||
return jwt.sign(data, config.secret_key, { expiresIn: '6h' });
|
||||
}
|
||||
};
|
||||
54  app-shell/src/index.js  Normal file
@@ -0,0 +1,54 @@
|
||||
const express = require('express');
|
||||
const cors = require('cors');
|
||||
const app = express();
|
||||
const bodyParser = require('body-parser');
|
||||
const checkPermissions = require('./middlewares/check-permissions');
|
||||
const modifyPath = require('./middlewares/modify-path');
|
||||
const VCS = require('./services/vcs');
|
||||
|
||||
const executorRoutes = require('./routes/executor');
|
||||
const vcsRoutes = require('./routes/vcs');
|
||||
|
||||
// Function to initialize the Git repository
|
||||
function initRepo() {
|
||||
const projectId = '31214';
|
||||
return VCS.initRepo(projectId);
|
||||
}
|
||||
|
||||
// Start the Express app on APP_SHELL_PORT (4000)
|
||||
function startServer() {
|
||||
const PORT = 4000;
|
||||
app.listen(PORT, () => {
|
||||
console.log(`Listening on port ${PORT}`);
|
||||
});
|
||||
}
|
||||
|
||||
// Run Git check after the server is up
|
||||
function runGitCheck() {
|
||||
initRepo()
|
||||
.then(result => {
|
||||
console.log(result?.message ? result.message : result);
|
||||
// Here you can add additional logic if needed
|
||||
})
|
||||
.catch(err => {
|
||||
console.error('Error during repo initialization:', err);
|
||||
// Optionally exit the process if Git check is critical:
|
||||
// process.exit(1);
|
||||
});
|
||||
}
|
||||
|
||||
app.use(cors({ origin: true }));
|
||||
app.use(bodyParser.json());
|
||||
app.use(checkPermissions);
|
||||
app.use(modifyPath);
|
||||
|
||||
app.use('/executor', executorRoutes);
|
||||
app.use('/vcs', vcsRoutes);
|
||||
|
||||
// Start the app_shell server
|
||||
startServer();
|
||||
|
||||
// Now perform Git check
|
||||
runGitCheck();
|
||||
|
||||
module.exports = app;
|
||||
17  app-shell/src/middlewares/check-permissions.js  Normal file
@@ -0,0 +1,17 @@
|
||||
const config = require('../config');
|
||||
|
||||
function checkPermissions(req, res, next) {
|
||||
const project_uuid = config.project_uuid;
|
||||
const requiredHeader = 'X-Project-UUID';
|
||||
const headerValue = req.headers[requiredHeader.toLowerCase()];
|
||||
// Logging whatever request we're getting
|
||||
console.log('Request:', req.url, req.method, req.body, req.headers);
|
||||
|
||||
if (headerValue && headerValue === project_uuid) {
|
||||
next();
|
||||
} else {
|
||||
res.status(403).send({ error: 'Stop right there, criminal scum! Your project UUID is invalid or missing.' });
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = checkPermissions;
|
||||
8  app-shell/src/middlewares/modify-path.js  Normal file
@@ -0,0 +1,8 @@
|
||||
function modifyPath(req, res, next) {
|
||||
if (req.body && req.body.path) {
|
||||
req.body.path = '../../../' + req.body.path;
|
||||
}
|
||||
next();
|
||||
}
|
||||
|
||||
module.exports = modifyPath;
|
||||
312  app-shell/src/routes/executor.js  Normal file
@@ -0,0 +1,312 @@
|
||||
const express = require('express');
|
||||
const multer = require('multer');
|
||||
const upload = multer({ dest: 'uploads/' });
|
||||
const fs = require('fs');
|
||||
|
||||
const ExecutorService = require('../services/executor');
|
||||
|
||||
const wrapAsync = require('../helpers').wrapAsync;
|
||||
|
||||
const router = express.Router();
|
||||
|
||||
router.post(
|
||||
'/read_project_tree',
|
||||
wrapAsync(async (req, res) => {
|
||||
const { path } = req.body;
|
||||
const tree = await ExecutorService.readProjectTree(path);
|
||||
res.status(200).send(tree);
|
||||
}),
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/read_file',
|
||||
wrapAsync(async (req, res) => {
|
||||
const { path, showLines } = req.body;
|
||||
const content = await ExecutorService.readFileContents(path, showLines);
|
||||
res.status(200).send(content);
|
||||
}),
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/count_file_lines',
|
||||
wrapAsync(async (req, res) => {
|
||||
const { path } = req.body;
|
||||
const content = await ExecutorService.countFileLines(path);
|
||||
res.status(200).send(content);
|
||||
}),
|
||||
);
|
||||
|
||||
// router.post(
|
||||
// '/read_file_header',
|
||||
// wrapAsync(async (req, res) => {
|
||||
// const { path, N } = req.body;
|
||||
// try {
|
||||
// const header = await ExecutorService.readFileHeader(path, N);
|
||||
// res.status(200).send(header);
|
||||
// } catch (error) {
|
||||
// res.status(500).send({
|
||||
// error: true,
|
||||
// message: error.message,
|
||||
// details: error.details || error.stack,
|
||||
// validation: error.validation
|
||||
// });
|
||||
// }
|
||||
// }),
|
||||
// );
|
||||
|
||||
router.post(
|
||||
'/read_file_line_context',
|
||||
wrapAsync(async (req, res) => {
|
||||
const { path, lineNumber, windowSize, showLines } = req.body;
|
||||
try {
|
||||
const context = await ExecutorService.readFileLineContext(path, lineNumber, windowSize, showLines);
|
||||
res.status(200).send(context);
|
||||
} catch (error) {
|
||||
res.status(500).send({
|
||||
error: true,
|
||||
message: error.message,
|
||||
details: error.details || error.stack,
|
||||
validation: error.validation
|
||||
});
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/write_file',
|
||||
wrapAsync(async (req, res) => {
|
||||
const { path, fileContents, comment } = req.body;
|
||||
try {
|
||||
await ExecutorService.writeFile(path, fileContents, comment);
|
||||
res.status(200).send({ message: 'File written successfully' });
|
||||
} catch (error) {
|
||||
res.status(500).send({
|
||||
error: true,
|
||||
message: error.message,
|
||||
details: error.details || error.stack,
|
||||
validation: error.validation
|
||||
});
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/insert_file_content',
|
||||
wrapAsync(async (req, res) => {
|
||||
const { path, lineNumber, newContent, message } = req.body;
|
||||
try {
|
||||
await ExecutorService.insertFileContent(path, lineNumber, newContent, message);
|
||||
res.status(200).send({ message: 'File written successfully' });
|
||||
} catch (error) {
|
||||
res.status(500).send({
|
||||
error: true,
|
||||
message: error.message,
|
||||
details: error.details || error.stack,
|
||||
validation: error.validation
|
||||
});
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/replace_file_line',
|
||||
wrapAsync(async (req, res) => {
|
||||
const { path, lineNumber, newText } = req.body;
|
||||
try {
|
||||
const result = await ExecutorService.replaceFileLine(path, lineNumber, newText);
|
||||
res.status(200).send(result);
|
||||
} catch (error) {
|
||||
res.status(500).send({
|
||||
error: true,
|
||||
message: error.message,
|
||||
details: error.details || error.stack,
|
||||
validation: error.validation
|
||||
});
|
||||
}
|
||||
}),
|
||||
);
|
||||
router.post(
|
||||
'/replace_file_chunk',
|
||||
wrapAsync(async (req, res) => {
|
||||
const { path, startLine, endLine, newCode } = req.body;
|
||||
try {
|
||||
const result = await ExecutorService.replaceFileChunk(path, startLine, endLine, newCode);
|
||||
res.status(200).send(result);
|
||||
} catch (error) {
|
||||
res.status(500).send({
|
||||
error: true,
|
||||
message: error.message,
|
||||
details: error.details || error.stack,
|
||||
validation: error.validation
|
||||
});
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/delete_file_lines',
|
||||
wrapAsync(async (req, res) => {
|
||||
const { path, startLine, endLine, message } = req.body;
|
||||
try {
|
||||
const result = await ExecutorService.deleteFileLines(path, startLine, endLine, message);
|
||||
res.status(200).send(result);
|
||||
} catch (error) {
|
||||
res.status(500).send({
|
||||
error: true,
|
||||
message: error.message,
|
||||
details: error.details || error.stack,
|
||||
validation: error.validation
|
||||
});
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/validate_file',
|
||||
wrapAsync(async (req, res) => {
|
||||
const { path } = req.body;
|
||||
try {
|
||||
const validationResult = await ExecutorService.validateFile(path);
|
||||
res.status(200).send({ validationResult });
|
||||
} catch (error) {
|
||||
res.status(500).send({
|
||||
error: true,
|
||||
message: error.message,
|
||||
details: error.details || error.stack,
|
||||
validation: error.validation
|
||||
});
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
|
||||
router.post(
|
||||
'/check_frontend_runtime_error',
|
||||
wrapAsync(async (req, res) => {
|
||||
try {
|
||||
const result = await ExecutorService.checkFrontendRuntimeLogs();
|
||||
res.status(200).send(result);
|
||||
} catch (error) {
|
||||
res.status(500).send({ error: error });
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
|
||||
router.post(
|
||||
'/replace_code_block',
|
||||
wrapAsync(async (req, res) => {
|
||||
const {path, oldCode, newCode, message} = req.body;
|
||||
try {
|
||||
const response = await ExecutorService.replaceCodeBlock(path, oldCode, newCode, message);
|
||||
res.status(200).send(response);
|
||||
} catch (error) {
|
||||
res.status(500).send({
|
||||
error: true,
|
||||
message: error.message,
|
||||
details: error.details || error.stack,
|
||||
validation: error.validation
|
||||
})
|
||||
}
|
||||
})
|
||||
)
|
||||
|
||||
router.post('/update_project_files_from_scheme',
|
||||
upload.single('file'), // 'file' - name of the field in the form
|
||||
async (req, res) => {
|
||||
console.log('Request received');
|
||||
console.log('Headers:', req.headers);
|
||||
if (!req.file) {
|
||||
return res.status(400).json({ error: 'No file uploaded' });
|
||||
}
|
||||
|
||||
console.log('File info:', {
|
||||
originalname: req.file.originalname,
|
||||
path: req.file.path,
|
||||
size: req.file.size,
|
||||
mimetype: req.file.mimetype
|
||||
});
|
||||
|
||||
try {
|
||||
console.log('Starting update process...');
|
||||
const result = await ExecutorService.updateProjectFilesFromScheme(req.file.path);
|
||||
console.log('Update completed, result:', result);
|
||||
|
||||
console.log('Removing temp file...');
|
||||
fs.unlinkSync(req.file.path);
|
||||
console.log('Temp file removed');
|
||||
|
||||
console.log('Sending response...');
|
||||
return res.json(result);
|
||||
} catch (error) {
|
||||
console.error('Error in route handler:', error);
|
||||
if (req.file) {
|
||||
try {
|
||||
fs.unlinkSync(req.file.path);
|
||||
console.log('Temp file removed after error');
|
||||
} catch (unlinkError) {
|
||||
console.error('Error removing temp file:', unlinkError);
|
||||
}
|
||||
}
|
||||
console.error('Update project files error:', error);
|
||||
return res.status(500).json({
|
||||
error: error.message,
|
||||
stack: process.env.NODE_ENV === 'development' ? error.stack : undefined
|
||||
});
|
||||
}
|
||||
}
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/get_db_schema',
|
||||
wrapAsync(async (req, res) => {
|
||||
try {
|
||||
|
||||
const jsonSchema = await ExecutorService.getDBSchema();
|
||||
res.status(200).send({ jsonSchema });
|
||||
} catch (error) {
|
||||
res.status(500).send({ error: error });
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/execute_sql',
|
||||
wrapAsync(async (req, res) => {
|
||||
try {
|
||||
const { query } = req.body;
|
||||
const result = await ExecutorService.executeSQL(query);
|
||||
res.status(200).send(result);
|
||||
} catch (error) {
|
||||
res.status(500).send({ error: error });
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
router.post(
|
||||
'/search_files',
|
||||
wrapAsync(async (req, res) => {
|
||||
try {
|
||||
const { searchStrings } = req.body;
|
||||
|
||||
if (
|
||||
typeof searchStrings !== 'string' &&
|
||||
!(
|
||||
Array.isArray(searchStrings) &&
|
||||
searchStrings.every(item => typeof item === 'string')
|
||||
)
|
||||
) {
|
||||
return res.status(400).send({ error: 'searchStrings must be a string or an array of strings' });
|
||||
}
|
||||
|
||||
const result = await ExecutorService.searchFiles(searchStrings);
|
||||
res.status(200).send(result);
|
||||
} catch (error) {
|
||||
res.status(500).send({ error: error.message });
|
||||
}
|
||||
}),
|
||||
);
|
||||
|
||||
router.use('/', require('../helpers').commonErrorHandler);
|
||||
|
||||
module.exports = router;
|
||||
40  app-shell/src/routes/vcs.js  Normal file
@@ -0,0 +1,40 @@
|
||||
const express = require('express');
|
||||
const wrapAsync = require('../helpers').wrapAsync; // wrapper for handling async routes
|
||||
const VSC = require('../services/vcs');
|
||||
const router = express.Router();
|
||||
|
||||
router.post('/init', wrapAsync(async (req, res) => {
|
||||
const result = await VSC.initRepo();
|
||||
res.status(200).send(result);
|
||||
}));
|
||||
|
||||
router.post('/commit', wrapAsync(async (req, res) => {
|
||||
const { message, files, dev_schema } = req.body;
|
||||
const result = await VSC.commitChanges(message, files, dev_schema);
|
||||
res.status(200).send(result);
|
||||
}));
|
||||
|
||||
router.post('/log', wrapAsync(async (req, res) => {
|
||||
const result = await VSC.getLog();
|
||||
res.status(200).send(result);
|
||||
}));
|
||||
|
||||
router.post('/rollback', wrapAsync(async (req, res) => {
|
||||
const { ref } = req.body;
|
||||
// const result = await VSC.checkout(ref);
|
||||
const result = await VSC.revert(ref);
|
||||
res.status(200).send(result);
|
||||
}));
|
||||
|
||||
router.post('/sync-to-stable', wrapAsync(async (req, res) => {
|
||||
const result = await VSC.mergeDevIntoMaster();
|
||||
res.status(200).send(result);
|
||||
}));
|
||||
|
||||
router.post('/reset-dev', wrapAsync(async (req, res) => {
|
||||
const result = await VSC.resetDevBranch();
|
||||
res.status(200).send(result);
|
||||
}));
|
||||
|
||||
router.use('/', require('../helpers').commonErrorHandler);
|
||||
module.exports = router;
|
||||
88  app-shell/src/services/database.js  Normal file
@@ -0,0 +1,88 @@
|
||||
// Database.js
|
||||
const { Client } = require('pg');
|
||||
const config = require('../../../backend/src/db/db.config');
|
||||
|
||||
const env = process.env.NODE_ENV || 'development';
|
||||
const dbConfig = config[env];
|
||||
|
||||
class Database {
|
||||
constructor() {
|
||||
this.client = new Client({
|
||||
user: dbConfig.username,
|
||||
password: dbConfig.password,
|
||||
database: dbConfig.database,
|
||||
host: dbConfig.host,
|
||||
port: dbConfig.port
|
||||
});
|
||||
|
||||
// Connect once, reuse the client
|
||||
this.client.connect().catch(err => {
|
||||
console.error('Error connecting to the database:', err);
|
||||
throw err;
|
||||
});
|
||||
}
|
||||
|
||||
async executeSQL(query) {
|
||||
try {
|
||||
const result = await this.client.query(query);
|
||||
return {
|
||||
success: true,
|
||||
rows: result.rows
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
// Method to fetch simple table/column info from 'information_schema'
|
||||
// (You can expand this to handle constraints, indexes, etc.)
|
||||
async getDBSchema(schemaName = 'public') {
|
||||
try {
|
||||
const tableQuery = `
|
||||
SELECT table_name
|
||||
FROM information_schema.tables
|
||||
WHERE table_schema = $1
|
||||
AND table_type = 'BASE TABLE'
|
||||
ORDER BY table_name
|
||||
`;
|
||||
|
||||
const columnQuery = `
|
||||
SELECT table_name, column_name, data_type, is_nullable
|
||||
FROM information_schema.columns
|
||||
WHERE table_schema = $1
|
||||
ORDER BY table_name, ordinal_position
|
||||
`;
|
||||
|
||||
const [tablesResult, columnsResult] = await Promise.all([
|
||||
this.client.query(tableQuery, [schemaName]),
|
||||
this.client.query(columnQuery, [schemaName]),
|
||||
]);
|
||||
|
||||
// Build a simple schema object:
|
||||
const tables = tablesResult.rows.map(row => row.table_name);
|
||||
const columnsByTable = {};
|
||||
|
||||
columnsResult.rows.forEach(row => {
|
||||
const { table_name, column_name, data_type, is_nullable } = row;
|
||||
if (!columnsByTable[table_name]) columnsByTable[table_name] = [];
|
||||
columnsByTable[table_name].push({ column_name, data_type, is_nullable });
|
||||
});
|
||||
|
||||
// Combine tables with their columns
|
||||
return tables.map(table => ({
|
||||
table,
|
||||
columns: columnsByTable[table] || [],
|
||||
}));
|
||||
} catch (error) {
|
||||
console.error('Error fetching schema:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async close() {
|
||||
await this.client.end();
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = new Database();
|
||||
1206  app-shell/src/services/executor.js  Normal file
File diff suppressed because it is too large
16  app-shell/src/services/notifications/errors/forbidden.js  Normal file
@@ -0,0 +1,16 @@
|
||||
const { getNotification, isNotification } = require('../helpers');
|
||||
|
||||
module.exports = class ForbiddenError extends Error {
|
||||
constructor(messageCode) {
|
||||
let message;
|
||||
|
||||
if (messageCode && isNotification(messageCode)) {
|
||||
message = getNotification(messageCode);
|
||||
}
|
||||
|
||||
message = message || getNotification('errors.forbidden.message');
|
||||
|
||||
super(message);
|
||||
this.code = 403;
|
||||
}
|
||||
};
|
||||
16  app-shell/src/services/notifications/errors/validation.js  Normal file
@@ -0,0 +1,16 @@
|
||||
const { getNotification, isNotification } = require('../helpers');
|
||||
|
||||
module.exports = class ValidationError extends Error {
|
||||
constructor(messageCode) {
|
||||
let message;
|
||||
|
||||
if (messageCode && isNotification(messageCode)) {
|
||||
message = getNotification(messageCode);
|
||||
}
|
||||
|
||||
message = message || getNotification('errors.validation.message');
|
||||
|
||||
super(message);
|
||||
this.code = 400;
|
||||
}
|
||||
};
|
||||
30  app-shell/src/services/notifications/helpers.js  Normal file
@@ -0,0 +1,30 @@
|
||||
const _get = require('lodash/get');
|
||||
const errors = require('./list');
|
||||
|
||||
function format(message, args) {
|
||||
if (!message) {
|
||||
return null;
|
||||
}
|
||||
|
||||
return message.replace(/{(\d+)}/g, function (match, number) {
|
||||
return typeof args[number] != 'undefined' ? args[number] : match;
|
||||
});
|
||||
}
|
||||
|
||||
const isNotification = (key) => {
|
||||
const message = _get(errors, key);
|
||||
return !!message;
|
||||
};
|
||||
|
||||
const getNotification = (key, ...args) => {
|
||||
const message = _get(errors, key);
|
||||
|
||||
if (!message) {
|
||||
return key;
|
||||
}
|
||||
|
||||
return format(message, args);
|
||||
};
|
||||
|
||||
exports.getNotification = getNotification;
|
||||
exports.isNotification = isNotification;
|
||||
100  app-shell/src/services/notifications/list.js  Normal file
@@ -0,0 +1,100 @@
|
||||
const errors = {
|
||||
app: {
|
||||
title: 'test',
|
||||
},
|
||||
|
||||
auth: {
|
||||
userDisabled: 'Your account is disabled',
|
||||
forbidden: 'Forbidden',
|
||||
unauthorized: 'Unauthorized',
|
||||
userNotFound: `Sorry, we don't recognize your credentials`,
|
||||
wrongPassword: `Sorry, we don't recognize your credentials`,
|
||||
weakPassword: 'This password is too weak',
|
||||
emailAlreadyInUse: 'Email is already in use',
|
||||
invalidEmail: 'Please provide a valid email',
|
||||
passwordReset: {
|
||||
invalidToken: 'Password reset link is invalid or has expired',
|
||||
error: `Email not recognized`,
|
||||
},
|
||||
passwordUpdate: {
|
||||
samePassword: `You can't use the same password. Please create new password`,
|
||||
},
|
||||
userNotVerified: `Sorry, your email has not been verified yet`,
|
||||
emailAddressVerificationEmail: {
|
||||
invalidToken: 'Email verification link is invalid or has expired',
|
||||
error: `Email not recognized`,
|
||||
},
|
||||
},
|
||||
|
||||
iam: {
|
||||
errors: {
|
||||
userAlreadyExists: 'User with this email already exists',
|
||||
userNotFound: 'User not found',
|
||||
disablingHimself: `You can't disable yourself`,
|
||||
revokingOwnPermission: `You can't revoke your own owner permission`,
|
||||
deletingHimself: `You can't delete yourself`,
|
||||
emailRequired: 'Email is required',
|
||||
},
|
||||
},
|
||||
|
||||
importer: {
|
||||
errors: {
|
||||
invalidFileEmpty: 'The file is empty',
|
||||
invalidFileExcel: 'Only excel (.xlsx) files are allowed',
|
||||
invalidFileUpload:
|
||||
'Invalid file. Make sure you are using the last version of the template.',
|
||||
importHashRequired: 'Import hash is required',
|
||||
importHashExistent: 'Data has already been imported',
|
||||
userEmailMissing: 'Some items in the CSV do not have an email',
|
||||
},
|
||||
},
|
||||
|
||||
errors: {
|
||||
forbidden: {
|
||||
message: 'Forbidden',
|
||||
},
|
||||
validation: {
|
||||
message: 'An error occurred',
|
||||
},
|
||||
searchQueryRequired: {
|
||||
message: 'Search query is required',
|
||||
},
|
||||
},
|
||||
|
||||
emails: {
|
||||
invitation: {
|
||||
subject: `You've been invited to {0}`,
|
||||
body: `
|
||||
<p>Hello,</p>
|
||||
<p>You've been invited to {0} set password for your {1} account.</p>
|
||||
<p><a href='{2}'>{2}</a></p>
|
||||
<p>Thanks,</p>
|
||||
<p>Your {0} team</p>
|
||||
`,
|
||||
},
|
||||
emailAddressVerification: {
|
||||
subject: `Verify your email for {0}`,
|
||||
body: `
|
||||
<p>Hello,</p>
|
||||
<p>Follow this link to verify your email address.</p>
|
||||
<p><a href='{0}'>{0}</a></p>
|
||||
<p>If you didn't ask to verify this address, you can ignore this email.</p>
|
||||
<p>Thanks,</p>
|
||||
<p>Your {1} team</p>
|
||||
`,
|
||||
},
|
||||
passwordReset: {
|
||||
subject: `Reset your password for {0}`,
|
||||
body: `
|
||||
<p>Hello,</p>
|
||||
<p>Follow this link to reset your {0} password for your {1} account.</p>
|
||||
<p><a href='{2}'>{2}</a></p>
|
||||
<p>If you didn't ask to reset your password, you can ignore this email.</p>
|
||||
<p>Thanks,</p>
|
||||
<p>Your {0} team</p>
|
||||
`,
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
module.exports = errors;
|
||||
67
app-shell/src/services/project-events.js
Normal file
67
app-shell/src/services/project-events.js
Normal file
@ -0,0 +1,67 @@
|
||||
const axios = require('axios');
|
||||
const config = require('../config.js');
|
||||
|
||||
class ProjectEventsService {
|
||||
/**
|
||||
* Sends a project event to the Rails backend
|
||||
*
|
||||
* @param {string} eventType - Type of the event
|
||||
* @param {object} payload - Event payload data
|
||||
* @param {object} options - Additional options
|
||||
* @param {string} [options.conversationId] - Optional conversation ID
|
||||
* @param {boolean} [options.isError=false] - Whether this is an error event
|
||||
* @returns {Promise<object>} - Response from the webhook
|
||||
*/
|
||||
static async sendEvent(eventType, payload = {}, options = {}) {
|
||||
try {
|
||||
console.log(`[DEBUG] Sending project event: ${eventType}`);
|
||||
|
||||
const webhookUrl = `https://flatlogic.com/projects/events_webhook`;
|
||||
|
||||
// Prepare the event data
|
||||
const eventData = {
|
||||
project_uuid: config.project_uuid,
|
||||
event_type: eventType,
|
||||
payload: {
|
||||
...payload,
|
||||
message: `[APP] ${payload.message}`,
|
||||
is_error: options.isError || false,
|
||||
system_message: true,
|
||||
is_command_info: true
|
||||
}
|
||||
};
|
||||
|
||||
// Add conversation ID if provided
|
||||
if (options.conversationId) {
|
||||
eventData.conversation_id = options.conversationId;
|
||||
}
|
||||
|
||||
const headers = {
|
||||
'Content-Type': 'application/json',
|
||||
'x-project-uuid': config.project_uuid
|
||||
};
|
||||
|
||||
console.log(`[DEBUG] Event data: ${JSON.stringify(eventData)}`);
|
||||
|
||||
const response = await axios.post(webhookUrl, eventData, { headers });
|
||||
|
||||
console.log(`[DEBUG] Event sent successfully, status: ${response.status}`);
|
||||
return response.data;
|
||||
} catch (error) {
|
||||
console.error(`[ERROR] Failed to send project event: ${error.message}`);
|
||||
if (error.response) {
|
||||
console.error(`[ERROR] Response status: ${error.response.status}`);
|
||||
console.error(`[ERROR] Response data: ${JSON.stringify(error.response.data)}`);
|
||||
}
|
||||
|
||||
// Don't throw the error, just return a failed status
|
||||
// This prevents errors in the event service from breaking app functionality
|
||||
return {
|
||||
success: false,
|
||||
error: error.message
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = ProjectEventsService;
|
||||
1205
app-shell/src/services/vcs.js
Normal file
1205
app-shell/src/services/vcs.js
Normal file
File diff suppressed because it is too large
Load Diff
3044
app-shell/yarn.lock
Normal file
3044
app-shell/yarn.lock
Normal file
File diff suppressed because it is too large
Load Diff
11
backend/.prettierrc
Normal file
11
backend/.prettierrc
Normal file
@ -0,0 +1,11 @@
|
||||
{
|
||||
"singleQuote": true,
|
||||
"tabWidth": 2,
|
||||
"printWidth": 80,
|
||||
"trailingComma": "all",
|
||||
"quoteProps": "as-needed",
|
||||
"jsxSingleQuote": true,
|
||||
"bracketSpacing": true,
|
||||
"bracketSameLine": false,
|
||||
"arrowParens": "always"
|
||||
}
|
||||
7
backend/.sequelizerc
Normal file
7
backend/.sequelizerc
Normal file
@ -0,0 +1,7 @@
|
||||
const path = require('path');
|
||||
module.exports = {
|
||||
"config": path.resolve("src", "db", "db.config.js"),
|
||||
"models-path": path.resolve("src", "db", "models"),
|
||||
"seeders-path": path.resolve("src", "db", "seeders"),
|
||||
"migrations-path": path.resolve("src", "db", "migrations")
|
||||
};
|
||||
23
backend/Dockerfile
Normal file
23
backend/Dockerfile
Normal file
@ -0,0 +1,23 @@
|
||||
FROM node:20.15.1-alpine
|
||||
|
||||
RUN apk update && apk add bash
|
||||
# Create app directory
|
||||
WORKDIR /usr/src/app
|
||||
|
||||
# Install app dependencies
|
||||
# A wildcard is used to ensure both package.json AND package-lock.json are copied
|
||||
# where available (npm@5+)
|
||||
COPY package*.json ./
|
||||
|
||||
RUN yarn install
|
||||
# If you are building your code for production
|
||||
# RUN npm ci --only=production
|
||||
|
||||
|
||||
# Bundle app source
|
||||
COPY . .
|
||||
|
||||
|
||||
EXPOSE 8080
|
||||
|
||||
CMD [ "yarn", "start" ]
|
||||
67
backend/README.md
Normal file
67
backend/README.md
Normal file
@ -0,0 +1,67 @@
|
||||
#AI Agent Hub - template backend,
|
||||
|
||||
#### Run App on local machine:
|
||||
|
||||
##### Install local dependencies:
|
||||
|
||||
- `yarn install`
|
||||
|
||||
---
|
||||
|
||||
##### Adjust local db:
|
||||
|
||||
###### 1. Install postgres:
|
||||
|
||||
- MacOS:
|
||||
|
||||
- `brew install postgres`
|
||||
|
||||
- Ubuntu:
|
||||
- `sudo apt update`
|
||||
- `sudo apt install postgresql postgresql-contrib`
|
||||
|
||||
###### 2. Create db and admin user:
|
||||
|
||||
- Before run and test connection, make sure you have created a database as described in the above configuration. You can use the `psql` command to create a user and database.
|
||||
|
||||
- `psql postgres --u postgres`
|
||||
|
||||
- Next, type this command for creating a new user with password then give access for creating the database.
|
||||
|
||||
- `postgres-# CREATE ROLE admin WITH LOGIN PASSWORD 'admin_pass';`
|
||||
- `postgres-# ALTER ROLE admin CREATEDB;`
|
||||
|
||||
- Quit `psql` then log in again using the new user that previously created.
|
||||
|
||||
- `postgres-# \q`
|
||||
- `psql postgres -U admin`
|
||||
|
||||
- Type this command to creating a new database.
|
||||
|
||||
- `postgres=> CREATE DATABASE db_ai_agent_hub;`
|
||||
|
||||
- Then give that new user privileges to the new database then quit the `psql`.
|
||||
- `postgres=> GRANT ALL PRIVILEGES ON DATABASE db_ai_agent_hub TO admin;`
|
||||
- `postgres=> \q`
|
||||
|
||||
---
|
||||
|
||||
#### Api Documentation (Swagger)
|
||||
|
||||
http://localhost:8080/api-docs (local host)
|
||||
|
||||
http://host_name/api-docs
|
||||
|
||||
---
|
||||
|
||||
##### Setup database tables or update after schema change
|
||||
|
||||
- `yarn db:migrate`
|
||||
|
||||
##### Seed the initial data (admin accounts, relevant for the first setup):
|
||||
|
||||
- `yarn db:seed`
|
||||
|
||||
##### Start build:
|
||||
|
||||
- `yarn start`
|
||||
53
backend/package.json
Normal file
53
backend/package.json
Normal file
@ -0,0 +1,53 @@
|
||||
{
|
||||
"name": "aiagenthub",
|
||||
"description": "AI Agent Hub - template backend",
|
||||
"scripts": {
|
||||
"start": "npm run db:migrate && npm run db:seed && npm run watch",
|
||||
"db:migrate": "sequelize-cli db:migrate",
|
||||
"db:seed": "sequelize-cli db:seed:all",
|
||||
"db:drop": "sequelize-cli db:drop",
|
||||
"db:create": "sequelize-cli db:create",
|
||||
"watch": "node watcher.js"
|
||||
},
|
||||
"dependencies": {
|
||||
"@google-cloud/storage": "^5.18.2",
|
||||
"axios": "^1.6.7",
|
||||
"bcrypt": "5.1.1",
|
||||
"chokidar": "^4.0.3",
|
||||
"cors": "2.8.5",
|
||||
"csv-parser": "^3.0.0",
|
||||
"express": "4.18.2",
|
||||
"formidable": "1.2.2",
|
||||
"helmet": "4.1.1",
|
||||
"json2csv": "^5.0.7",
|
||||
"jsonwebtoken": "8.5.1",
|
||||
"lodash": "4.17.21",
|
||||
"moment": "2.30.1",
|
||||
"multer": "^1.4.4",
|
||||
"mysql2": "2.2.5",
|
||||
"nodemailer": "6.9.9",
|
||||
"passport": "^0.7.0",
|
||||
"passport-google-oauth2": "^0.2.0",
|
||||
"passport-jwt": "^4.0.1",
|
||||
"passport-microsoft": "^0.1.0",
|
||||
"pg": "8.4.1",
|
||||
"pg-hstore": "2.3.4",
|
||||
"sequelize": "6.35.2",
|
||||
"sequelize-json-schema": "^2.1.1",
|
||||
"sqlite": "4.0.15",
|
||||
"swagger-jsdoc": "^6.2.8",
|
||||
"swagger-ui-express": "^5.0.0",
|
||||
"tedious": "^18.2.4"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=18"
|
||||
},
|
||||
"private": true,
|
||||
"devDependencies": {
|
||||
"cross-env": "7.0.3",
|
||||
"mocha": "8.1.3",
|
||||
"node-mocks-http": "1.9.0",
|
||||
"nodemon": "2.0.5",
|
||||
"sequelize-cli": "6.6.2"
|
||||
}
|
||||
}
|
||||
79
backend/src/auth/auth.js
Normal file
79
backend/src/auth/auth.js
Normal file
@ -0,0 +1,79 @@
|
||||
const config = require('../config');
|
||||
const providers = config.providers;
|
||||
const helpers = require('../helpers');
|
||||
const db = require('../db/models');
|
||||
|
||||
const passport = require('passport');
|
||||
const JWTstrategy = require('passport-jwt').Strategy;
|
||||
const ExtractJWT = require('passport-jwt').ExtractJwt;
|
||||
const GoogleStrategy = require('passport-google-oauth2').Strategy;
|
||||
const MicrosoftStrategy = require('passport-microsoft').Strategy;
|
||||
const UsersDBApi = require('../db/api/users');
|
||||
|
||||
passport.use(
|
||||
new JWTstrategy(
|
||||
{
|
||||
passReqToCallback: true,
|
||||
secretOrKey: config.secret_key,
|
||||
jwtFromRequest: ExtractJWT.fromAuthHeaderAsBearerToken(),
|
||||
},
|
||||
async (req, token, done) => {
|
||||
try {
|
||||
const user = await UsersDBApi.findBy({ email: token.user.email });
|
||||
|
||||
if (user && user.disabled) {
|
||||
return done(new Error(`User '${user.email}' is disabled`));
|
||||
}
|
||||
|
||||
req.currentUser = user;
|
||||
|
||||
return done(null, user);
|
||||
} catch (error) {
|
||||
done(error);
|
||||
}
|
||||
},
|
||||
),
|
||||
);
|
||||
|
||||
passport.use(
|
||||
new GoogleStrategy(
|
||||
{
|
||||
clientID: config.google.clientId,
|
||||
clientSecret: config.google.clientSecret,
|
||||
callbackURL: config.apiUrl + '/auth/signin/google/callback',
|
||||
passReqToCallback: true,
|
||||
},
|
||||
function (request, accessToken, refreshToken, profile, done) {
|
||||
socialStrategy(profile.email, profile, providers.GOOGLE, done);
|
||||
},
|
||||
),
|
||||
);
|
||||
|
||||
passport.use(
|
||||
new MicrosoftStrategy(
|
||||
{
|
||||
clientID: config.microsoft.clientId,
|
||||
clientSecret: config.microsoft.clientSecret,
|
||||
callbackURL: config.apiUrl + '/auth/signin/microsoft/callback',
|
||||
passReqToCallback: true,
|
||||
},
|
||||
function (request, accessToken, refreshToken, profile, done) {
|
||||
const email = profile._json.mail || profile._json.userPrincipalName;
|
||||
socialStrategy(email, profile, providers.MICROSOFT, done);
|
||||
},
|
||||
),
|
||||
);
|
||||
|
||||
function socialStrategy(email, profile, provider, done) {
|
||||
db.users
|
||||
.findOrCreate({ where: { email, provider } })
|
||||
.then(([user, created]) => {
|
||||
const body = {
|
||||
id: user.id,
|
||||
email: user.email,
|
||||
name: profile.displayName,
|
||||
};
|
||||
const token = helpers.jwtSign({ user: body });
|
||||
return done(null, { token });
|
||||
});
|
||||
}
|
||||
73
backend/src/config.js
Normal file
73
backend/src/config.js
Normal file
@ -0,0 +1,73 @@
|
||||
const os = require('os');
|
||||
|
||||
const config = {
|
||||
gcloud: {
|
||||
bucket: 'fldemo-files',
|
||||
hash: 'df544d042d0bedc191f6e2a9b82d3c92',
|
||||
},
|
||||
bcrypt: {
|
||||
saltRounds: 12,
|
||||
},
|
||||
admin_pass: '194c4336',
|
||||
user_pass: 'd3b5ade71d6e',
|
||||
admin_email: 'admin@flatlogic.com',
|
||||
providers: {
|
||||
LOCAL: 'local',
|
||||
GOOGLE: 'google',
|
||||
MICROSOFT: 'microsoft',
|
||||
},
|
||||
secret_key: process.env.SECRET_KEY || '',
|
||||
remote: '',
|
||||
port: process.env.NODE_ENV === 'production' ? '' : '8080',
|
||||
hostUI: process.env.NODE_ENV === 'production' ? '' : 'http://localhost',
|
||||
portUI: process.env.NODE_ENV === 'production' ? '' : '3000',
|
||||
|
||||
portUIProd: process.env.NODE_ENV === 'production' ? '' : ':3000',
|
||||
|
||||
swaggerUI: process.env.NODE_ENV === 'production' ? '' : 'http://localhost',
|
||||
swaggerPort: process.env.NODE_ENV === 'production' ? '' : ':8080',
|
||||
google: {
|
||||
clientId: process.env.GOOGLE_CLIENT_ID || '',
|
||||
clientSecret: process.env.GOOGLE_CLIENT_SECRET || '',
|
||||
},
|
||||
microsoft: {
|
||||
clientId: process.env.MS_CLIENT_ID || '',
|
||||
clientSecret: process.env.MS_CLIENT_SECRET || '',
|
||||
},
|
||||
uploadDir: os.tmpdir(),
|
||||
email: {
|
||||
from: 'AI Agent Hub <app@flatlogic.app>',
|
||||
host: 'email-smtp.us-east-1.amazonaws.com',
|
||||
port: 587,
|
||||
auth: {
|
||||
user: process.env.EMAIL_USER || '',
|
||||
pass: process.env.EMAIL_PASS,
|
||||
},
|
||||
tls: {
|
||||
rejectUnauthorized: false,
|
||||
},
|
||||
},
|
||||
roles: {
|
||||
admin: 'Administrator',
|
||||
|
||||
user: 'Viewer',
|
||||
},
|
||||
|
||||
project_uuid: '194c4336-7bad-454e-8302-d3b5ade71d6e',
|
||||
flHost:
|
||||
process.env.NODE_ENV === 'production' ||
|
||||
process.env.NODE_ENV === 'dev_stage'
|
||||
? 'https://flatlogic.com/projects'
|
||||
: 'http://localhost:3000/projects',
|
||||
};
|
||||
|
||||
config.pexelsKey = process.env.PEXELS_KEY || '';
|
||||
config.pexelsQuery = 'Abstract AI network concept';
|
||||
config.host =
|
||||
process.env.NODE_ENV === 'production' ? config.remote : 'http://localhost';
|
||||
config.apiUrl = `${config.host}${config.port ? `:${config.port}` : ``}/api`;
|
||||
config.swaggerUrl = `${config.swaggerUI}${config.swaggerPort}`;
|
||||
config.uiUrl = `${config.hostUI}${config.portUI ? `:${config.portUI}` : ``}/#`;
|
||||
config.backUrl = `${config.hostUI}${config.portUI ? `:${config.portUI}` : ``}`;
|
||||
|
||||
module.exports = config;
|
||||
303
backend/src/db/api/accounts.js
Normal file
303
backend/src/db/api/accounts.js
Normal file
@ -0,0 +1,303 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class AccountsDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const accounts = await db.accounts.create(
|
||||
{
|
||||
id: data.id || undefined,
|
||||
|
||||
name: data.name || null,
|
||||
importHash: data.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await accounts.setUser(data.user || null, {
|
||||
transaction,
|
||||
});
|
||||
|
||||
return accounts;
|
||||
}
|
||||
|
||||
static async bulkImport(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
// Prepare data - wrapping individual data transformations in a map() method
|
||||
const accountsData = data.map((item, index) => ({
|
||||
id: item.id || undefined,
|
||||
|
||||
name: item.name || null,
|
||||
importHash: item.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
createdAt: new Date(Date.now() + index * 1000),
|
||||
}));
|
||||
|
||||
// Bulk create items
|
||||
const accounts = await db.accounts.bulkCreate(accountsData, {
|
||||
transaction,
|
||||
});
|
||||
|
||||
// For each item created, replace relation files
|
||||
|
||||
return accounts;
|
||||
}
|
||||
|
||||
static async update(id, data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const accounts = await db.accounts.findByPk(id, {}, { transaction });
|
||||
|
||||
const updatePayload = {};
|
||||
|
||||
if (data.name !== undefined) updatePayload.name = data.name;
|
||||
|
||||
updatePayload.updatedById = currentUser.id;
|
||||
|
||||
await accounts.update(updatePayload, { transaction });
|
||||
|
||||
if (data.user !== undefined) {
|
||||
await accounts.setUser(
|
||||
data.user,
|
||||
|
||||
{ transaction },
|
||||
);
|
||||
}
|
||||
|
||||
return accounts;
|
||||
}
|
||||
|
||||
static async deleteByIds(ids, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const accounts = await db.accounts.findAll({
|
||||
where: {
|
||||
id: {
|
||||
[Op.in]: ids,
|
||||
},
|
||||
},
|
||||
transaction,
|
||||
});
|
||||
|
||||
await db.sequelize.transaction(async (transaction) => {
|
||||
for (const record of accounts) {
|
||||
await record.update({ deletedBy: currentUser.id }, { transaction });
|
||||
}
|
||||
for (const record of accounts) {
|
||||
await record.destroy({ transaction });
|
||||
}
|
||||
});
|
||||
|
||||
return accounts;
|
||||
}
|
||||
|
||||
static async remove(id, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const accounts = await db.accounts.findByPk(id, options);
|
||||
|
||||
await accounts.update(
|
||||
{
|
||||
deletedBy: currentUser.id,
|
||||
},
|
||||
{
|
||||
transaction,
|
||||
},
|
||||
);
|
||||
|
||||
await accounts.destroy({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return accounts;
|
||||
}
|
||||
|
||||
static async findBy(where, options) {
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const accounts = await db.accounts.findOne({ where }, { transaction });
|
||||
|
||||
if (!accounts) {
|
||||
return accounts;
|
||||
}
|
||||
|
||||
const output = accounts.get({ plain: true });
|
||||
|
||||
output.contacts_account = await accounts.getContacts_account({
|
||||
transaction,
|
||||
});
|
||||
|
||||
output.contact_lists_account = await accounts.getContact_lists_account({
|
||||
transaction,
|
||||
});
|
||||
|
||||
output.secure_gmail_tokens_account =
|
||||
await accounts.getSecure_gmail_tokens_account({
|
||||
transaction,
|
||||
});
|
||||
|
||||
output.user = await accounts.getUser({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return output;
|
||||
}
|
||||
|
||||
static async findAll(filter, options) {
|
||||
const limit = filter.limit || 0;
|
||||
let offset = 0;
|
||||
let where = {};
|
||||
const currentPage = +filter.page;
|
||||
|
||||
offset = currentPage * limit;
|
||||
|
||||
const orderBy = null;
|
||||
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
let include = [
|
||||
{
|
||||
model: db.users,
|
||||
as: 'user',
|
||||
|
||||
where: filter.user
|
||||
? {
|
||||
[Op.or]: [
|
||||
{
|
||||
id: {
|
||||
[Op.in]: filter.user
|
||||
.split('|')
|
||||
.map((term) => Utils.uuid(term)),
|
||||
},
|
||||
},
|
||||
{
|
||||
firstName: {
|
||||
[Op.or]: filter.user
|
||||
.split('|')
|
||||
.map((term) => ({ [Op.iLike]: `%${term}%` })),
|
||||
},
|
||||
},
|
||||
],
|
||||
}
|
||||
: {},
|
||||
},
|
||||
];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.name) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike('accounts', 'name', filter.name),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.accounts.findAndCountAll(queryOptions);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('accounts', 'name', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.accounts.findAll({
|
||||
attributes: ['id', 'name'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['name', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.name,
|
||||
}));
|
||||
}
|
||||
};
|
||||
279
backend/src/db/api/agents.js
Normal file
279
backend/src/db/api/agents.js
Normal file
@ -0,0 +1,279 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class AgentsDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const agents = await db.agents.create(
|
||||
{
|
||||
id: data.id || undefined,
|
||||
|
||||
name: data.name || null,
|
||||
expertise: data.expertise || null,
|
||||
purpose: data.purpose || null,
|
||||
status: data.status || null,
|
||||
importHash: data.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
return agents;
|
||||
}
|
||||
|
||||
static async bulkImport(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
// Prepare data - wrapping individual data transformations in a map() method
|
||||
const agentsData = data.map((item, index) => ({
|
||||
id: item.id || undefined,
|
||||
|
||||
name: item.name || null,
|
||||
expertise: item.expertise || null,
|
||||
purpose: item.purpose || null,
|
||||
status: item.status || null,
|
||||
importHash: item.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
createdAt: new Date(Date.now() + index * 1000),
|
||||
}));
|
||||
|
||||
// Bulk create items
|
||||
const agents = await db.agents.bulkCreate(agentsData, { transaction });
|
||||
|
||||
// For each item created, replace relation files
|
||||
|
||||
return agents;
|
||||
}
|
||||
|
||||
static async update(id, data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const agents = await db.agents.findByPk(id, {}, { transaction });
|
||||
|
||||
const updatePayload = {};
|
||||
|
||||
if (data.name !== undefined) updatePayload.name = data.name;
|
||||
|
||||
if (data.expertise !== undefined) updatePayload.expertise = data.expertise;
|
||||
|
||||
if (data.purpose !== undefined) updatePayload.purpose = data.purpose;
|
||||
|
||||
if (data.status !== undefined) updatePayload.status = data.status;
|
||||
|
||||
updatePayload.updatedById = currentUser.id;
|
||||
|
||||
await agents.update(updatePayload, { transaction });
|
||||
|
||||
return agents;
|
||||
}
|
||||
|
||||
static async deleteByIds(ids, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const agents = await db.agents.findAll({
|
||||
where: {
|
||||
id: {
|
||||
[Op.in]: ids,
|
||||
},
|
||||
},
|
||||
transaction,
|
||||
});
|
||||
|
||||
await db.sequelize.transaction(async (transaction) => {
|
||||
for (const record of agents) {
|
||||
await record.update({ deletedBy: currentUser.id }, { transaction });
|
||||
}
|
||||
for (const record of agents) {
|
||||
await record.destroy({ transaction });
|
||||
}
|
||||
});
|
||||
|
||||
return agents;
|
||||
}
|
||||
|
||||
static async remove(id, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const agents = await db.agents.findByPk(id, options);
|
||||
|
||||
await agents.update(
|
||||
{
|
||||
deletedBy: currentUser.id,
|
||||
},
|
||||
{
|
||||
transaction,
|
||||
},
|
||||
);
|
||||
|
||||
await agents.destroy({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return agents;
|
||||
}
|
||||
|
||||
static async findBy(where, options) {
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const agents = await db.agents.findOne({ where }, { transaction });
|
||||
|
||||
if (!agents) {
|
||||
return agents;
|
||||
}
|
||||
|
||||
const output = agents.get({ plain: true });
|
||||
|
||||
return output;
|
||||
}
|
||||
|
||||
static async findAll(filter, options) {
|
||||
const limit = filter.limit || 0;
|
||||
let offset = 0;
|
||||
let where = {};
|
||||
const currentPage = +filter.page;
|
||||
|
||||
offset = currentPage * limit;
|
||||
|
||||
const orderBy = null;
|
||||
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
let include = [];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.name) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike('agents', 'name', filter.name),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.expertise) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike('agents', 'expertise', filter.expertise),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.purpose) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike('agents', 'purpose', filter.purpose),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.status) {
|
||||
where = {
|
||||
...where,
|
||||
status: filter.status,
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.agents.findAndCountAll(queryOptions);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('agents', 'name', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.agents.findAll({
|
||||
attributes: ['id', 'name'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['name', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.name,
|
||||
}));
|
||||
}
|
||||
};
|
||||
247
backend/src/db/api/auto_reply_rules.js
Normal file
247
backend/src/db/api/auto_reply_rules.js
Normal file
@ -0,0 +1,247 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class Auto_reply_rulesDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const auto_reply_rules = await db.auto_reply_rules.create(
|
||||
{
|
||||
id: data.id || undefined,
|
||||
|
||||
importHash: data.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
return auto_reply_rules;
|
||||
}
|
||||
|
||||
static async bulkImport(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
// Prepare data - wrapping individual data transformations in a map() method
|
||||
const auto_reply_rulesData = data.map((item, index) => ({
|
||||
id: item.id || undefined,
|
||||
|
||||
importHash: item.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
createdAt: new Date(Date.now() + index * 1000),
|
||||
}));
|
||||
|
||||
// Bulk create items
|
||||
const auto_reply_rules = await db.auto_reply_rules.bulkCreate(
|
||||
auto_reply_rulesData,
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
// For each item created, replace relation files
|
||||
|
||||
return auto_reply_rules;
|
||||
}
|
||||
|
||||
static async update(id, data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const auto_reply_rules = await db.auto_reply_rules.findByPk(
|
||||
id,
|
||||
{},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
const updatePayload = {};
|
||||
|
||||
updatePayload.updatedById = currentUser.id;
|
||||
|
||||
await auto_reply_rules.update(updatePayload, { transaction });
|
||||
|
||||
return auto_reply_rules;
|
||||
}
|
||||
|
||||
static async deleteByIds(ids, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const auto_reply_rules = await db.auto_reply_rules.findAll({
|
||||
where: {
|
||||
id: {
|
||||
[Op.in]: ids,
|
||||
},
|
||||
},
|
||||
transaction,
|
||||
});
|
||||
|
||||
await db.sequelize.transaction(async (transaction) => {
|
||||
for (const record of auto_reply_rules) {
|
||||
await record.update({ deletedBy: currentUser.id }, { transaction });
|
||||
}
|
||||
for (const record of auto_reply_rules) {
|
||||
await record.destroy({ transaction });
|
||||
}
|
||||
});
|
||||
|
||||
return auto_reply_rules;
|
||||
}
|
||||
|
||||
static async remove(id, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const auto_reply_rules = await db.auto_reply_rules.findByPk(id, options);
|
||||
|
||||
await auto_reply_rules.update(
|
||||
{
|
||||
deletedBy: currentUser.id,
|
||||
},
|
||||
{
|
||||
transaction,
|
||||
},
|
||||
);
|
||||
|
||||
await auto_reply_rules.destroy({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return auto_reply_rules;
|
||||
}
|
||||
|
||||
static async findBy(where, options) {
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const auto_reply_rules = await db.auto_reply_rules.findOne(
|
||||
{ where },
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
if (!auto_reply_rules) {
|
||||
return auto_reply_rules;
|
||||
}
|
||||
|
||||
const output = auto_reply_rules.get({ plain: true });
|
||||
|
||||
return output;
|
||||
}
|
||||
|
||||
static async findAll(filter, options) {
|
||||
const limit = filter.limit || 0;
|
||||
let offset = 0;
|
||||
let where = {};
|
||||
const currentPage = +filter.page;
|
||||
|
||||
offset = currentPage * limit;
|
||||
|
||||
const orderBy = null;
|
||||
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
let include = [];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.auto_reply_rules.findAndCountAll(
|
||||
queryOptions,
|
||||
);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('auto_reply_rules', 'id', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.auto_reply_rules.findAll({
|
||||
attributes: ['id', 'id'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['id', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.id,
|
||||
}));
|
||||
}
|
||||
};
|
||||
250
backend/src/db/api/contact_list_membership.js
Normal file
250
backend/src/db/api/contact_list_membership.js
Normal file
@ -0,0 +1,250 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class Contact_list_membershipDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const contact_list_membership = await db.contact_list_membership.create(
|
||||
{
|
||||
id: data.id || undefined,
|
||||
|
||||
importHash: data.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
return contact_list_membership;
|
||||
}
|
||||
|
||||
static async bulkImport(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
// Prepare data - wrapping individual data transformations in a map() method
|
||||
const contact_list_membershipData = data.map((item, index) => ({
|
||||
id: item.id || undefined,
|
||||
|
||||
importHash: item.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
createdAt: new Date(Date.now() + index * 1000),
|
||||
}));
|
||||
|
||||
// Bulk create items
|
||||
const contact_list_membership = await db.contact_list_membership.bulkCreate(
|
||||
contact_list_membershipData,
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
// For each item created, replace relation files
|
||||
|
||||
return contact_list_membership;
|
||||
}
|
||||
|
||||
static async update(id, data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const contact_list_membership = await db.contact_list_membership.findByPk(
|
||||
id,
|
||||
{},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
const updatePayload = {};
|
||||
|
||||
updatePayload.updatedById = currentUser.id;
|
||||
|
||||
await contact_list_membership.update(updatePayload, { transaction });
|
||||
|
||||
return contact_list_membership;
|
||||
}
|
||||
|
||||
static async deleteByIds(ids, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const contact_list_membership = await db.contact_list_membership.findAll({
|
||||
where: {
|
||||
id: {
|
||||
[Op.in]: ids,
|
||||
},
|
||||
},
|
||||
transaction,
|
||||
});
|
||||
|
||||
await db.sequelize.transaction(async (transaction) => {
|
||||
for (const record of contact_list_membership) {
|
||||
await record.update({ deletedBy: currentUser.id }, { transaction });
|
||||
}
|
||||
for (const record of contact_list_membership) {
|
||||
await record.destroy({ transaction });
|
||||
}
|
||||
});
|
||||
|
||||
return contact_list_membership;
|
||||
}
|
||||
|
||||
static async remove(id, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const contact_list_membership = await db.contact_list_membership.findByPk(
|
||||
id,
|
||||
options,
|
||||
);
|
||||
|
||||
await contact_list_membership.update(
|
||||
{
|
||||
deletedBy: currentUser.id,
|
||||
},
|
||||
{
|
||||
transaction,
|
||||
},
|
||||
);
|
||||
|
||||
await contact_list_membership.destroy({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return contact_list_membership;
|
||||
}
|
||||
|
||||
static async findBy(where, options) {
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const contact_list_membership = await db.contact_list_membership.findOne(
|
||||
{ where },
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
if (!contact_list_membership) {
|
||||
return contact_list_membership;
|
||||
}
|
||||
|
||||
const output = contact_list_membership.get({ plain: true });
|
||||
|
||||
return output;
|
||||
}
|
||||
|
||||
static async findAll(filter, options) {
|
||||
const limit = filter.limit || 0;
|
||||
let offset = 0;
|
||||
let where = {};
|
||||
const currentPage = +filter.page;
|
||||
|
||||
offset = currentPage * limit;
|
||||
|
||||
const orderBy = null;
|
||||
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
let include = [];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.contact_list_membership.findAndCountAll(
|
||||
queryOptions,
|
||||
);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('contact_list_membership', 'id', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.contact_list_membership.findAll({
|
||||
attributes: ['id', 'id'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['id', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.id,
|
||||
}));
|
||||
}
|
||||
};
|
||||
288
backend/src/db/api/contact_lists.js
Normal file
288
backend/src/db/api/contact_lists.js
Normal file
@ -0,0 +1,288 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class Contact_listsDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const contact_lists = await db.contact_lists.create(
|
||||
{
|
||||
id: data.id || undefined,
|
||||
|
||||
importHash: data.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await contact_lists.setAccount(data.account || null, {
|
||||
transaction,
|
||||
});
|
||||
|
||||
return contact_lists;
|
||||
}
|
||||
|
||||
static async bulkImport(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
// Prepare data - wrapping individual data transformations in a map() method
|
||||
const contact_listsData = data.map((item, index) => ({
|
||||
id: item.id || undefined,
|
||||
|
||||
importHash: item.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
createdAt: new Date(Date.now() + index * 1000),
|
||||
}));
|
||||
|
||||
// Bulk create items
|
||||
const contact_lists = await db.contact_lists.bulkCreate(contact_listsData, {
|
||||
transaction,
|
||||
});
|
||||
|
||||
// For each item created, replace relation files
|
||||
|
||||
return contact_lists;
|
||||
}
|
||||
|
||||
static async update(id, data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const contact_lists = await db.contact_lists.findByPk(
|
||||
id,
|
||||
{},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
const updatePayload = {};
|
||||
|
||||
updatePayload.updatedById = currentUser.id;
|
||||
|
||||
await contact_lists.update(updatePayload, { transaction });
|
||||
|
||||
if (data.account !== undefined) {
|
||||
await contact_lists.setAccount(
|
||||
data.account,
|
||||
|
||||
{ transaction },
|
||||
);
|
||||
}
|
||||
|
||||
return contact_lists;
|
||||
}
|
||||
|
||||
static async deleteByIds(ids, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const contact_lists = await db.contact_lists.findAll({
|
||||
where: {
|
||||
id: {
|
||||
[Op.in]: ids,
|
||||
},
|
||||
},
|
||||
transaction,
|
||||
});
|
||||
|
||||
await db.sequelize.transaction(async (transaction) => {
|
||||
for (const record of contact_lists) {
|
||||
await record.update({ deletedBy: currentUser.id }, { transaction });
|
||||
}
|
||||
for (const record of contact_lists) {
|
||||
await record.destroy({ transaction });
|
||||
}
|
||||
});
|
||||
|
||||
return contact_lists;
|
||||
}
|
||||
|
||||
static async remove(id, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const contact_lists = await db.contact_lists.findByPk(id, options);
|
||||
|
||||
await contact_lists.update(
|
||||
{
|
||||
deletedBy: currentUser.id,
|
||||
},
|
||||
{
|
||||
transaction,
|
||||
},
|
||||
);
|
||||
|
||||
await contact_lists.destroy({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return contact_lists;
|
||||
}
|
||||
|
||||
static async findBy(where, options) {
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const contact_lists = await db.contact_lists.findOne(
|
||||
{ where },
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
if (!contact_lists) {
|
||||
return contact_lists;
|
||||
}
|
||||
|
||||
const output = contact_lists.get({ plain: true });
|
||||
|
||||
output.account = await contact_lists.getAccount({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return output;
|
||||
}
|
||||
|
||||
static async findAll(filter, options) {
|
||||
const limit = filter.limit || 0;
|
||||
let offset = 0;
|
||||
let where = {};
|
||||
const currentPage = +filter.page;
|
||||
|
||||
offset = currentPage * limit;
|
||||
|
||||
const orderBy = null;
|
||||
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
let include = [
|
||||
{
|
||||
model: db.accounts,
|
||||
as: 'account',
|
||||
|
||||
where: filter.account
|
||||
? {
|
||||
[Op.or]: [
|
||||
{
|
||||
id: {
|
||||
[Op.in]: filter.account
|
||||
.split('|')
|
||||
.map((term) => Utils.uuid(term)),
|
||||
},
|
||||
},
|
||||
{
|
||||
name: {
|
||||
[Op.or]: filter.account
|
||||
.split('|')
|
||||
.map((term) => ({ [Op.iLike]: `%${term}%` })),
|
||||
},
|
||||
},
|
||||
],
|
||||
}
|
||||
: {},
|
||||
},
|
||||
];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.contact_lists.findAndCountAll(
|
||||
queryOptions,
|
||||
);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('contact_lists', 'id', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.contact_lists.findAll({
|
||||
attributes: ['id', 'id'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['id', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.id,
|
||||
}));
|
||||
}
|
||||
};
|
||||
250
backend/src/db/api/contact_sequence_status.js
Normal file
250
backend/src/db/api/contact_sequence_status.js
Normal file
@ -0,0 +1,250 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class Contact_sequence_statusDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const contact_sequence_status = await db.contact_sequence_status.create(
|
||||
{
|
||||
id: data.id || undefined,
|
||||
|
||||
importHash: data.importHash || null,
|
||||
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );

    return contact_sequence_status;
  }

  static async bulkImport(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // Prepare data - wrapping individual data transformations in a map() method
    const contact_sequence_statusData = data.map((item, index) => ({
      id: item.id || undefined,

      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));

    // Bulk create items
    const contact_sequence_status = await db.contact_sequence_status.bulkCreate(
      contact_sequence_statusData,
      { transaction },
    );

    // For each item created, replace relation files

    return contact_sequence_status;
  }

  static async update(id, data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // findByPk takes (id, options); the transaction belongs in the options object
    const contact_sequence_status = await db.contact_sequence_status.findByPk(id, {
      transaction,
    });

    const updatePayload = {};

    updatePayload.updatedById = currentUser.id;

    await contact_sequence_status.update(updatePayload, { transaction });

    return contact_sequence_status;
  }

  static async deleteByIds(ids, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const contact_sequence_status = await db.contact_sequence_status.findAll({
      where: {
        id: {
          [Op.in]: ids,
        },
      },
      transaction,
    });

    await db.sequelize.transaction(async (transaction) => {
      for (const record of contact_sequence_status) {
        await record.update({ deletedBy: currentUser.id }, { transaction });
      }
      for (const record of contact_sequence_status) {
        await record.destroy({ transaction });
      }
    });

    return contact_sequence_status;
  }

  static async remove(id, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const contact_sequence_status = await db.contact_sequence_status.findByPk(
      id,
      options,
    );

    await contact_sequence_status.update(
      {
        deletedBy: currentUser.id,
      },
      {
        transaction,
      },
    );

    await contact_sequence_status.destroy({
      transaction,
    });

    return contact_sequence_status;
  }

  static async findBy(where, options) {
    const transaction = (options && options.transaction) || undefined;

    // findOne accepts a single options object; include the transaction alongside where
    const contact_sequence_status = await db.contact_sequence_status.findOne({
      where,
      transaction,
    });

    if (!contact_sequence_status) {
      return contact_sequence_status;
    }

    const output = contact_sequence_status.get({ plain: true });

    return output;
  }

  static async findAll(filter, options) {
    const limit = filter.limit || 0;
    let offset = 0;
    let where = {};
    const currentPage = +filter.page;

    offset = currentPage * limit;

    const orderBy = null;

    const transaction = (options && options.transaction) || undefined;

    let include = [];

    if (filter) {
      if (filter.id) {
        where = {
          ...where,
          ['id']: Utils.uuid(filter.id),
        };
      }

      if (filter.active !== undefined) {
        where = {
          ...where,
          active: filter.active === true || filter.active === 'true',
        };
      }

      if (filter.createdAtRange) {
        const [start, end] = filter.createdAtRange;

        if (start !== undefined && start !== null && start !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.gte]: start,
            },
          };
        }

        if (end !== undefined && end !== null && end !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.lte]: end,
            },
          };
        }
      }
    }

    const queryOptions = {
      where,
      include,
      distinct: true,
      order:
        filter.field && filter.sort
          ? [[filter.field, filter.sort]]
          : [['createdAt', 'desc']],
      transaction: options?.transaction,
      logging: console.log,
    };

    if (!options?.countOnly) {
      queryOptions.limit = limit ? Number(limit) : undefined;
      queryOptions.offset = offset ? Number(offset) : undefined;
    }

    try {
      const { rows, count } = await db.contact_sequence_status.findAndCountAll(
        queryOptions,
      );

      return {
        rows: options?.countOnly ? [] : rows,
        count: count,
      };
    } catch (error) {
      console.error('Error executing query:', error);
      throw error;
    }
  }

  static async findAllAutocomplete(query, limit, offset) {
    let where = {};

    if (query) {
      where = {
        [Op.or]: [
          { ['id']: Utils.uuid(query) },
          Utils.ilike('contact_sequence_status', 'id', query),
        ],
      };
    }

    const records = await db.contact_sequence_status.findAll({
      attributes: ['id'],
      where,
      limit: limit ? Number(limit) : undefined,
      offset: offset ? Number(offset) : undefined,
      // Sequelize expects `order`, not `orderBy`
      order: [['id', 'ASC']],
    });

    return records.map((record) => ({
      id: record.id,
      label: record.id,
    }));
  }
};
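These generated `*DBApi` classes are plain static-method wrappers around the Sequelize models, so callers are expected to thread `currentUser` and a transaction through the `options` argument. A minimal sketch of how a route might drive `Contact_sequence_statusDBApi.create`, assuming the project's `wrapAsync` helper and an auth middleware that sets `req.currentUser`; the require paths, route path, and payload shape are illustrative, not taken from this commit:

```javascript
const express = require('express');
const wrapAsync = require('../helpers/wrapAsync');
const db = require('../db/models');
const Contact_sequence_statusDBApi = require('../db/api/contact_sequence_status');

const router = express.Router();

router.post(
  '/',
  wrapAsync(async (req, res) => {
    // Run the whole create inside one transaction so the row and any
    // association writes commit or roll back together.
    const payload = await db.sequelize.transaction(async (transaction) =>
      Contact_sequence_statusDBApi.create(req.body.data, {
        currentUser: req.currentUser, // assumed to be set by auth middleware
        transaction,
      }),
    );
    res.status(200).send(payload);
  }),
);

module.exports = router;
```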
330
backend/src/db/api/contact_tags.js
Normal file
@@ -0,0 +1,330 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');

const Sequelize = db.Sequelize;
const Op = Sequelize.Op;

module.exports = class Contact_tagsDBApi {
  static async create(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const contact_tags = await db.contact_tags.create(
      {
        id: data.id || undefined,

        importHash: data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );

    await contact_tags.setContact(data.contact || null, {
      transaction,
    });

    await contact_tags.setTag(data.tag || null, {
      transaction,
    });

    return contact_tags;
  }

  static async bulkImport(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // Prepare data - wrapping individual data transformations in a map() method
    const contact_tagsData = data.map((item, index) => ({
      id: item.id || undefined,

      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));

    // Bulk create items
    const contact_tags = await db.contact_tags.bulkCreate(contact_tagsData, {
      transaction,
    });

    // For each item created, replace relation files

    return contact_tags;
  }

  static async update(id, data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // findByPk takes (id, options); pass the transaction in the options object
    const contact_tags = await db.contact_tags.findByPk(id, { transaction });

    const updatePayload = {};

    updatePayload.updatedById = currentUser.id;

    await contact_tags.update(updatePayload, { transaction });

    if (data.contact !== undefined) {
      await contact_tags.setContact(data.contact, { transaction });
    }

    if (data.tag !== undefined) {
      await contact_tags.setTag(data.tag, { transaction });
    }

    return contact_tags;
  }

  static async deleteByIds(ids, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const contact_tags = await db.contact_tags.findAll({
      where: {
        id: {
          [Op.in]: ids,
        },
      },
      transaction,
    });

    await db.sequelize.transaction(async (transaction) => {
      for (const record of contact_tags) {
        await record.update({ deletedBy: currentUser.id }, { transaction });
      }
      for (const record of contact_tags) {
        await record.destroy({ transaction });
      }
    });

    return contact_tags;
  }

  static async remove(id, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const contact_tags = await db.contact_tags.findByPk(id, options);

    await contact_tags.update(
      {
        deletedBy: currentUser.id,
      },
      {
        transaction,
      },
    );

    await contact_tags.destroy({
      transaction,
    });

    return contact_tags;
  }

  static async findBy(where, options) {
    const transaction = (options && options.transaction) || undefined;

    // findOne accepts a single options object; include the transaction alongside where
    const contact_tags = await db.contact_tags.findOne({ where, transaction });

    if (!contact_tags) {
      return contact_tags;
    }

    const output = contact_tags.get({ plain: true });

    output.contact = await contact_tags.getContact({
      transaction,
    });

    output.tag = await contact_tags.getTag({
      transaction,
    });

    return output;
  }

  static async findAll(filter, options) {
    const limit = filter.limit || 0;
    let offset = 0;
    let where = {};
    const currentPage = +filter.page;

    offset = currentPage * limit;

    const orderBy = null;

    const transaction = (options && options.transaction) || undefined;

    let include = [
      {
        model: db.contacts,
        as: 'contact',

        where: filter.contact
          ? {
              [Op.or]: [
                {
                  id: {
                    [Op.in]: filter.contact
                      .split('|')
                      .map((term) => Utils.uuid(term)),
                  },
                },
                {
                  email: {
                    [Op.or]: filter.contact
                      .split('|')
                      .map((term) => ({ [Op.iLike]: `%${term}%` })),
                  },
                },
              ],
            }
          : {},
      },

      {
        model: db.tags,
        as: 'tag',

        where: filter.tag
          ? {
              [Op.or]: [
                {
                  id: {
                    [Op.in]: filter.tag
                      .split('|')
                      .map((term) => Utils.uuid(term)),
                  },
                },
                {
                  name: {
                    [Op.or]: filter.tag
                      .split('|')
                      .map((term) => ({ [Op.iLike]: `%${term}%` })),
                  },
                },
              ],
            }
          : {},
      },
    ];

    if (filter) {
      if (filter.id) {
        where = {
          ...where,
          ['id']: Utils.uuid(filter.id),
        };
      }

      if (filter.active !== undefined) {
        where = {
          ...where,
          active: filter.active === true || filter.active === 'true',
        };
      }

      if (filter.createdAtRange) {
        const [start, end] = filter.createdAtRange;

        if (start !== undefined && start !== null && start !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.gte]: start,
            },
          };
        }

        if (end !== undefined && end !== null && end !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.lte]: end,
            },
          };
        }
      }
    }

    const queryOptions = {
      where,
      include,
      distinct: true,
      order:
        filter.field && filter.sort
          ? [[filter.field, filter.sort]]
          : [['createdAt', 'desc']],
      transaction: options?.transaction,
      logging: console.log,
    };

    if (!options?.countOnly) {
      queryOptions.limit = limit ? Number(limit) : undefined;
      queryOptions.offset = offset ? Number(offset) : undefined;
    }

    try {
      const { rows, count } = await db.contact_tags.findAndCountAll(
        queryOptions,
      );

      return {
        rows: options?.countOnly ? [] : rows,
        count: count,
      };
    } catch (error) {
      console.error('Error executing query:', error);
      throw error;
    }
  }

  static async findAllAutocomplete(query, limit, offset) {
    let where = {};

    if (query) {
      where = {
        [Op.or]: [
          { ['id']: Utils.uuid(query) },
          Utils.ilike('contact_tags', 'id', query),
        ],
      };
    }

    const records = await db.contact_tags.findAll({
      attributes: ['id'],
      where,
      limit: limit ? Number(limit) : undefined,
      offset: offset ? Number(offset) : undefined,
      // Sequelize expects `order`, not `orderBy`
      order: [['id', 'ASC']],
    });

    return records.map((record) => ({
      id: record.id,
      label: record.id,
    }));
  }
};
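`Contact_tagsDBApi.create` persists the scalar columns first and only afterwards wires the `contact` and `tag` foreign keys through the Sequelize-generated `setContact`/`setTag` setters, so the setters issue their own queries. Passing one shared `transaction` keeps the row and its links atomic. A hedged usage sketch, with placeholder ids and an assumed require path:

```javascript
const db = require('../db/models');
const Contact_tagsDBApi = require('../db/api/contact_tags');

async function tagContact(contactId, tagId, currentUser) {
  // One transaction covers the INSERT plus the two association updates.
  return db.sequelize.transaction(async (transaction) =>
    Contact_tagsDBApi.create(
      { contact: contactId, tag: tagId },
      { currentUser, transaction },
    ),
  );
}

module.exports = { tagContact };
```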
317
backend/src/db/api/contacts.js
Normal file
@@ -0,0 +1,317 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');

const Sequelize = db.Sequelize;
const Op = Sequelize.Op;

module.exports = class ContactsDBApi {
  static async create(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const contacts = await db.contacts.create(
      {
        id: data.id || undefined,

        email: data.email || null,
        first_name: data.first_name || null,
        last_name: data.last_name || null,
        importHash: data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );

    await contacts.setAccount(data.account || null, {
      transaction,
    });

    return contacts;
  }

  static async bulkImport(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // Prepare data - wrapping individual data transformations in a map() method
    const contactsData = data.map((item, index) => ({
      id: item.id || undefined,

      email: item.email || null,
      first_name: item.first_name || null,
      last_name: item.last_name || null,
      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));

    // Bulk create items
    const contacts = await db.contacts.bulkCreate(contactsData, {
      transaction,
    });

    // For each item created, replace relation files

    return contacts;
  }

  static async update(id, data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // findByPk takes (id, options); pass the transaction in the options object
    const contacts = await db.contacts.findByPk(id, { transaction });

    const updatePayload = {};

    if (data.email !== undefined) updatePayload.email = data.email;

    if (data.first_name !== undefined)
      updatePayload.first_name = data.first_name;

    if (data.last_name !== undefined) updatePayload.last_name = data.last_name;

    updatePayload.updatedById = currentUser.id;

    await contacts.update(updatePayload, { transaction });

    if (data.account !== undefined) {
      await contacts.setAccount(data.account, { transaction });
    }

    return contacts;
  }

  static async deleteByIds(ids, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const contacts = await db.contacts.findAll({
      where: {
        id: {
          [Op.in]: ids,
        },
      },
      transaction,
    });

    await db.sequelize.transaction(async (transaction) => {
      for (const record of contacts) {
        await record.update({ deletedBy: currentUser.id }, { transaction });
      }
      for (const record of contacts) {
        await record.destroy({ transaction });
      }
    });

    return contacts;
  }

  static async remove(id, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const contacts = await db.contacts.findByPk(id, options);

    await contacts.update(
      {
        deletedBy: currentUser.id,
      },
      {
        transaction,
      },
    );

    await contacts.destroy({
      transaction,
    });

    return contacts;
  }

  static async findBy(where, options) {
    const transaction = (options && options.transaction) || undefined;

    // findOne accepts a single options object; include the transaction alongside where
    const contacts = await db.contacts.findOne({ where, transaction });

    if (!contacts) {
      return contacts;
    }

    const output = contacts.get({ plain: true });

    output.contact_tags_contact = await contacts.getContact_tags_contact({
      transaction,
    });

    output.account = await contacts.getAccount({
      transaction,
    });

    return output;
  }

  static async findAll(filter, options) {
    const limit = filter.limit || 0;
    let offset = 0;
    let where = {};
    const currentPage = +filter.page;

    offset = currentPage * limit;

    const orderBy = null;

    const transaction = (options && options.transaction) || undefined;

    let include = [
      {
        model: db.accounts,
        as: 'account',

        where: filter.account
          ? {
              [Op.or]: [
                {
                  id: {
                    [Op.in]: filter.account
                      .split('|')
                      .map((term) => Utils.uuid(term)),
                  },
                },
                {
                  name: {
                    [Op.or]: filter.account
                      .split('|')
                      .map((term) => ({ [Op.iLike]: `%${term}%` })),
                  },
                },
              ],
            }
          : {},
      },
    ];

    if (filter) {
      if (filter.id) {
        where = {
          ...where,
          ['id']: Utils.uuid(filter.id),
        };
      }

      if (filter.email) {
        where = {
          ...where,
          [Op.and]: Utils.ilike('contacts', 'email', filter.email),
        };
      }

      if (filter.first_name) {
        where = {
          ...where,
          [Op.and]: Utils.ilike('contacts', 'first_name', filter.first_name),
        };
      }

      if (filter.last_name) {
        where = {
          ...where,
          [Op.and]: Utils.ilike('contacts', 'last_name', filter.last_name),
        };
      }

      if (filter.active !== undefined) {
        where = {
          ...where,
          active: filter.active === true || filter.active === 'true',
        };
      }

      if (filter.createdAtRange) {
        const [start, end] = filter.createdAtRange;

        if (start !== undefined && start !== null && start !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.gte]: start,
            },
          };
        }

        if (end !== undefined && end !== null && end !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.lte]: end,
            },
          };
        }
      }
    }

    const queryOptions = {
      where,
      include,
      distinct: true,
      order:
        filter.field && filter.sort
          ? [[filter.field, filter.sort]]
          : [['createdAt', 'desc']],
      transaction: options?.transaction,
      logging: console.log,
    };

    if (!options?.countOnly) {
      queryOptions.limit = limit ? Number(limit) : undefined;
      queryOptions.offset = offset ? Number(offset) : undefined;
    }

    try {
      const { rows, count } = await db.contacts.findAndCountAll(queryOptions);

      return {
        rows: options?.countOnly ? [] : rows,
        count: count,
      };
    } catch (error) {
      console.error('Error executing query:', error);
      throw error;
    }
  }

  static async findAllAutocomplete(query, limit, offset) {
    let where = {};

    if (query) {
      where = {
        [Op.or]: [
          { ['id']: Utils.uuid(query) },
          Utils.ilike('contacts', 'email', query),
        ],
      };
    }

    const records = await db.contacts.findAll({
      attributes: ['id', 'email'],
      where,
      limit: limit ? Number(limit) : undefined,
      offset: offset ? Number(offset) : undefined,
      // Sequelize expects `order`, not `orderBy`
      order: [['email', 'ASC']],
    });

    return records.map((record) => ({
      id: record.id,
      label: record.email,
    }));
  }
};
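`ContactsDBApi.findAll` expects the pagination and filter state in a single `filter` object: `page` and `limit` drive the offset, `field`/`sort` pick the ordering, and date-range filters arrive as two-element arrays. A sketch of a typical call, with assumed filter values:

```javascript
const ContactsDBApi = require('../db/api/contacts');

async function listRecentContacts() {
  const { rows, count } = await ContactsDBApi.findAll(
    {
      email: 'example.com',                 // matched case-insensitively via Utils.ilike
      createdAtRange: ['2024-01-01', null], // null upper bound is simply skipped
      page: 0,
      limit: 25,
      field: 'email',
      sort: 'asc',
    },
    { countOnly: false },
  );
  return { rows, count };
}
```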
361
backend/src/db/api/conversations.js
Normal file
@@ -0,0 +1,361 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');

const Sequelize = db.Sequelize;
const Op = Sequelize.Op;

module.exports = class ConversationsDBApi {
  static async create(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const conversations = await db.conversations.create(
      {
        id: data.id || undefined,

        title: data.title || null,
        createdat: data.createdat || null,
        updatedat: data.updatedat || null,
        importHash: data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );

    await conversations.setUser(data.user || null, {
      transaction,
    });

    return conversations;
  }

  static async bulkImport(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // Prepare data - wrapping individual data transformations in a map() method
    const conversationsData = data.map((item, index) => ({
      id: item.id || undefined,

      title: item.title || null,
      createdat: item.createdat || null,
      updatedat: item.updatedat || null,
      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));

    // Bulk create items
    const conversations = await db.conversations.bulkCreate(conversationsData, {
      transaction,
    });

    // For each item created, replace relation files

    return conversations;
  }

  static async update(id, data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // findByPk takes (id, options); pass the transaction in the options object
    const conversations = await db.conversations.findByPk(id, { transaction });

    const updatePayload = {};

    if (data.title !== undefined) updatePayload.title = data.title;

    if (data.createdat !== undefined) updatePayload.createdat = data.createdat;

    if (data.updatedat !== undefined) updatePayload.updatedat = data.updatedat;

    updatePayload.updatedById = currentUser.id;

    await conversations.update(updatePayload, { transaction });

    if (data.user !== undefined) {
      await conversations.setUser(data.user, { transaction });
    }

    return conversations;
  }

  static async deleteByIds(ids, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const conversations = await db.conversations.findAll({
      where: {
        id: {
          [Op.in]: ids,
        },
      },
      transaction,
    });

    await db.sequelize.transaction(async (transaction) => {
      for (const record of conversations) {
        await record.update({ deletedBy: currentUser.id }, { transaction });
      }
      for (const record of conversations) {
        await record.destroy({ transaction });
      }
    });

    return conversations;
  }

  static async remove(id, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const conversations = await db.conversations.findByPk(id, options);

    await conversations.update(
      {
        deletedBy: currentUser.id,
      },
      {
        transaction,
      },
    );

    await conversations.destroy({
      transaction,
    });

    return conversations;
  }

  static async findBy(where, options) {
    const transaction = (options && options.transaction) || undefined;

    // findOne accepts a single options object; include the transaction alongside where
    const conversations = await db.conversations.findOne({
      where,
      transaction,
    });

    if (!conversations) {
      return conversations;
    }

    const output = conversations.get({ plain: true });

    output.messages_conversation = await conversations.getMessages_conversation(
      {
        transaction,
      },
    );

    output.user = await conversations.getUser({
      transaction,
    });

    return output;
  }

  static async findAll(filter, options) {
    const limit = filter.limit || 0;
    let offset = 0;
    let where = {};
    const currentPage = +filter.page;

    offset = currentPage * limit;

    const orderBy = null;

    const transaction = (options && options.transaction) || undefined;

    let include = [
      {
        model: db.users,
        as: 'user',

        where: filter.user
          ? {
              [Op.or]: [
                {
                  id: {
                    [Op.in]: filter.user
                      .split('|')
                      .map((term) => Utils.uuid(term)),
                  },
                },
                {
                  firstName: {
                    [Op.or]: filter.user
                      .split('|')
                      .map((term) => ({ [Op.iLike]: `%${term}%` })),
                  },
                },
              ],
            }
          : {},
      },
    ];

    if (filter) {
      if (filter.id) {
        where = {
          ...where,
          ['id']: Utils.uuid(filter.id),
        };
      }

      if (filter.title) {
        where = {
          ...where,
          [Op.and]: Utils.ilike('conversations', 'title', filter.title),
        };
      }

      if (filter.createdatRange) {
        const [start, end] = filter.createdatRange;

        if (start !== undefined && start !== null && start !== '') {
          where = {
            ...where,
            createdat: {
              ...where.createdat,
              [Op.gte]: start,
            },
          };
        }

        if (end !== undefined && end !== null && end !== '') {
          where = {
            ...where,
            createdat: {
              ...where.createdat,
              [Op.lte]: end,
            },
          };
        }
      }

      if (filter.updatedatRange) {
        const [start, end] = filter.updatedatRange;

        if (start !== undefined && start !== null && start !== '') {
          where = {
            ...where,
            updatedat: {
              ...where.updatedat,
              [Op.gte]: start,
            },
          };
        }

        if (end !== undefined && end !== null && end !== '') {
          where = {
            ...where,
            updatedat: {
              ...where.updatedat,
              [Op.lte]: end,
            },
          };
        }
      }

      if (filter.active !== undefined) {
        where = {
          ...where,
          active: filter.active === true || filter.active === 'true',
        };
      }

      if (filter.createdAtRange) {
        const [start, end] = filter.createdAtRange;

        if (start !== undefined && start !== null && start !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.gte]: start,
            },
          };
        }

        if (end !== undefined && end !== null && end !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.lte]: end,
            },
          };
        }
      }
    }

    const queryOptions = {
      where,
      include,
      distinct: true,
      order:
        filter.field && filter.sort
          ? [[filter.field, filter.sort]]
          : [['createdAt', 'desc']],
      transaction: options?.transaction,
      logging: console.log,
    };

    if (!options?.countOnly) {
      queryOptions.limit = limit ? Number(limit) : undefined;
      queryOptions.offset = offset ? Number(offset) : undefined;
    }

    try {
      const { rows, count } = await db.conversations.findAndCountAll(
        queryOptions,
      );

      return {
        rows: options?.countOnly ? [] : rows,
        count: count,
      };
    } catch (error) {
      console.error('Error executing query:', error);
      throw error;
    }
  }

  static async findAllAutocomplete(query, limit, offset) {
    let where = {};

    if (query) {
      where = {
        [Op.or]: [
          { ['id']: Utils.uuid(query) },
          Utils.ilike('conversations', 'title', query),
        ],
      };
    }

    const records = await db.conversations.findAll({
      attributes: ['id', 'title'],
      where,
      limit: limit ? Number(limit) : undefined,
      offset: offset ? Number(offset) : undefined,
      // Sequelize expects `order`, not `orderBy`
      order: [['title', 'ASC']],
    });

    return records.map((record) => ({
      id: record.id,
      label: record.title,
    }));
  }
};
73
backend/src/db/api/file.js
Normal file
@@ -0,0 +1,73 @@
const db = require('../models');
const assert = require('assert');
const services = require('../../services/file');

module.exports = class FileDBApi {
  static async replaceRelationFiles(relation, rawFiles, options) {
    assert(relation.belongsTo, 'belongsTo is required');
    assert(relation.belongsToColumn, 'belongsToColumn is required');
    assert(relation.belongsToId, 'belongsToId is required');

    let files = [];

    if (Array.isArray(rawFiles)) {
      files = rawFiles;
    } else {
      files = rawFiles ? [rawFiles] : [];
    }

    await this._removeLegacyFiles(relation, files, options);
    await this._addFiles(relation, files, options);
  }

  static async _addFiles(relation, files, options) {
    const transaction = (options && options.transaction) || undefined;
    const currentUser = (options && options.currentUser) || { id: null };

    const inexistentFiles = files.filter((file) => !!file.new);

    for (const file of inexistentFiles) {
      await db.file.create(
        {
          belongsTo: relation.belongsTo,
          belongsToColumn: relation.belongsToColumn,
          belongsToId: relation.belongsToId,
          name: file.name,
          sizeInBytes: file.sizeInBytes,
          privateUrl: file.privateUrl,
          publicUrl: file.publicUrl,
          createdById: currentUser.id,
          updatedById: currentUser.id,
        },
        {
          transaction,
        },
      );
    }
  }

  static async _removeLegacyFiles(relation, files, options) {
    const transaction = (options && options.transaction) || undefined;

    const filesToDelete = await db.file.findAll({
      where: {
        belongsTo: relation.belongsTo,
        belongsToId: relation.belongsToId,
        belongsToColumn: relation.belongsToColumn,
        id: {
          [db.Sequelize.Op.notIn]: files
            .filter((file) => !file.new)
            .map((file) => file.id),
        },
      },
      transaction,
    });

    for (let file of filesToDelete) {
      await services.deleteGCloud(file.privateUrl);
      await file.destroy({
        transaction,
      });
    }
  }
};
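`FileDBApi.replaceRelationFiles` is the hook the `// For each item created, replace relation files` placeholders above would call: it diffs the incoming file list against what is already attached for the given `belongsTo`/`belongsToColumn`/`belongsToId`, removes the leftovers via `services.deleteGCloud`, and inserts rows for files flagged `new`. A hedged sketch of how an entity API might invoke it; the `avatar` column name is purely illustrative:

```javascript
const db = require('../models');
const FileDBApi = require('./file');

async function attachAvatar(contactId, rawFiles, options) {
  await FileDBApi.replaceRelationFiles(
    {
      belongsTo: db.contacts.getTableName(), // 'contacts'
      belongsToColumn: 'avatar',             // assumed file field name
      belongsToId: contactId,
    },
    rawFiles, // a single file object or an array; both shapes are accepted
    options,  // { currentUser, transaction }
  );
}

module.exports = { attachAvatar };
```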
340
backend/src/db/api/messages.js
Normal file
@@ -0,0 +1,340 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');

const Sequelize = db.Sequelize;
const Op = Sequelize.Op;

module.exports = class MessagesDBApi {
  static async create(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const messages = await db.messages.create(
      {
        id: data.id || undefined,

        content: data.content || null,
        sender: data.sender || null,
        agentname: data.agentname || null,
        createdat: data.createdat || null,
        importHash: data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );

    await messages.setConversation(data.conversation || null, {
      transaction,
    });

    return messages;
  }

  static async bulkImport(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // Prepare data - wrapping individual data transformations in a map() method
    const messagesData = data.map((item, index) => ({
      id: item.id || undefined,

      content: item.content || null,
      sender: item.sender || null,
      agentname: item.agentname || null,
      createdat: item.createdat || null,
      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));

    // Bulk create items
    const messages = await db.messages.bulkCreate(messagesData, {
      transaction,
    });

    // For each item created, replace relation files

    return messages;
  }

  static async update(id, data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // findByPk takes (id, options); pass the transaction in the options object
    const messages = await db.messages.findByPk(id, { transaction });

    const updatePayload = {};

    if (data.content !== undefined) updatePayload.content = data.content;

    if (data.sender !== undefined) updatePayload.sender = data.sender;

    if (data.agentname !== undefined) updatePayload.agentname = data.agentname;

    if (data.createdat !== undefined) updatePayload.createdat = data.createdat;

    updatePayload.updatedById = currentUser.id;

    await messages.update(updatePayload, { transaction });

    if (data.conversation !== undefined) {
      await messages.setConversation(data.conversation, { transaction });
    }

    return messages;
  }

  static async deleteByIds(ids, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const messages = await db.messages.findAll({
      where: {
        id: {
          [Op.in]: ids,
        },
      },
      transaction,
    });

    await db.sequelize.transaction(async (transaction) => {
      for (const record of messages) {
        await record.update({ deletedBy: currentUser.id }, { transaction });
      }
      for (const record of messages) {
        await record.destroy({ transaction });
      }
    });

    return messages;
  }

  static async remove(id, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const messages = await db.messages.findByPk(id, options);

    await messages.update(
      {
        deletedBy: currentUser.id,
      },
      {
        transaction,
      },
    );

    await messages.destroy({
      transaction,
    });

    return messages;
  }

  static async findBy(where, options) {
    const transaction = (options && options.transaction) || undefined;

    // findOne accepts a single options object; include the transaction alongside where
    const messages = await db.messages.findOne({ where, transaction });

    if (!messages) {
      return messages;
    }

    const output = messages.get({ plain: true });

    output.conversation = await messages.getConversation({
      transaction,
    });

    return output;
  }

  static async findAll(filter, options) {
    const limit = filter.limit || 0;
    let offset = 0;
    let where = {};
    const currentPage = +filter.page;

    offset = currentPage * limit;

    const orderBy = null;

    const transaction = (options && options.transaction) || undefined;

    let include = [
      {
        model: db.conversations,
        as: 'conversation',

        where: filter.conversation
          ? {
              [Op.or]: [
                {
                  id: {
                    [Op.in]: filter.conversation
                      .split('|')
                      .map((term) => Utils.uuid(term)),
                  },
                },
                {
                  title: {
                    [Op.or]: filter.conversation
                      .split('|')
                      .map((term) => ({ [Op.iLike]: `%${term}%` })),
                  },
                },
              ],
            }
          : {},
      },
    ];

    if (filter) {
      if (filter.id) {
        where = {
          ...where,
          ['id']: Utils.uuid(filter.id),
        };
      }

      if (filter.content) {
        where = {
          ...where,
          [Op.and]: Utils.ilike('messages', 'content', filter.content),
        };
      }

      if (filter.sender) {
        where = {
          ...where,
          [Op.and]: Utils.ilike('messages', 'sender', filter.sender),
        };
      }

      if (filter.agentname) {
        where = {
          ...where,
          [Op.and]: Utils.ilike('messages', 'agentname', filter.agentname),
        };
      }

      if (filter.createdatRange) {
        const [start, end] = filter.createdatRange;

        if (start !== undefined && start !== null && start !== '') {
          where = {
            ...where,
            createdat: {
              ...where.createdat,
              [Op.gte]: start,
            },
          };
        }

        if (end !== undefined && end !== null && end !== '') {
          where = {
            ...where,
            createdat: {
              ...where.createdat,
              [Op.lte]: end,
            },
          };
        }
      }

      if (filter.active !== undefined) {
        where = {
          ...where,
          active: filter.active === true || filter.active === 'true',
        };
      }

      if (filter.createdAtRange) {
        const [start, end] = filter.createdAtRange;

        if (start !== undefined && start !== null && start !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.gte]: start,
            },
          };
        }

        if (end !== undefined && end !== null && end !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.lte]: end,
            },
          };
        }
      }
    }

    const queryOptions = {
      where,
      include,
      distinct: true,
      order:
        filter.field && filter.sort
          ? [[filter.field, filter.sort]]
          : [['createdAt', 'desc']],
      transaction: options?.transaction,
      logging: console.log,
    };

    if (!options?.countOnly) {
      queryOptions.limit = limit ? Number(limit) : undefined;
      queryOptions.offset = offset ? Number(offset) : undefined;
    }

    try {
      const { rows, count } = await db.messages.findAndCountAll(queryOptions);

      return {
        rows: options?.countOnly ? [] : rows,
        count: count,
      };
    } catch (error) {
      console.error('Error executing query:', error);
      throw error;
    }
  }

  static async findAllAutocomplete(query, limit, offset) {
    let where = {};

    if (query) {
      where = {
        [Op.or]: [
          { ['id']: Utils.uuid(query) },
          Utils.ilike('messages', 'sender', query),
        ],
      };
    }

    const records = await db.messages.findAll({
      attributes: ['id', 'sender'],
      where,
      limit: limit ? Number(limit) : undefined,
      offset: offset ? Number(offset) : undefined,
      // Sequelize expects `order`, not `orderBy`
      order: [['sender', 'ASC']],
    });

    return records.map((record) => ({
      id: record.id,
      label: record.sender,
    }));
  }
};
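As in the other APIs, `MessagesDBApi.deleteByIds` first stamps `deletedBy` on every matched row and then destroys them, and it does so inside a fresh `db.sequelize.transaction` rather than the transaction used for the lookup; callers therefore do not need to, and currently cannot, make the delete join an outer transaction. A hedged example of a bulk delete driven from a service:

```javascript
const MessagesDBApi = require('../db/api/messages');

async function purgeMessages(ids, currentUser) {
  // The update + destroy pair runs in its own transaction inside deleteByIds.
  const deleted = await MessagesDBApi.deleteByIds(ids, { currentUser });
  return deleted.length;
}

module.exports = { purgeMessages };
```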
253
backend/src/db/api/permissions.js
Normal file
@@ -0,0 +1,253 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');

const Sequelize = db.Sequelize;
const Op = Sequelize.Op;

module.exports = class PermissionsDBApi {
  static async create(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const permissions = await db.permissions.create(
      {
        id: data.id || undefined,

        name: data.name || null,
        importHash: data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );

    return permissions;
  }

  static async bulkImport(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // Prepare data - wrapping individual data transformations in a map() method
    const permissionsData = data.map((item, index) => ({
      id: item.id || undefined,

      name: item.name || null,
      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));

    // Bulk create items
    const permissions = await db.permissions.bulkCreate(permissionsData, {
      transaction,
    });

    // For each item created, replace relation files

    return permissions;
  }

  static async update(id, data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // findByPk takes (id, options); pass the transaction in the options object
    const permissions = await db.permissions.findByPk(id, { transaction });

    const updatePayload = {};

    if (data.name !== undefined) updatePayload.name = data.name;

    updatePayload.updatedById = currentUser.id;

    await permissions.update(updatePayload, { transaction });

    return permissions;
  }

  static async deleteByIds(ids, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const permissions = await db.permissions.findAll({
      where: {
        id: {
          [Op.in]: ids,
        },
      },
      transaction,
    });

    await db.sequelize.transaction(async (transaction) => {
      for (const record of permissions) {
        await record.update({ deletedBy: currentUser.id }, { transaction });
      }
      for (const record of permissions) {
        await record.destroy({ transaction });
      }
    });

    return permissions;
  }

  static async remove(id, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const permissions = await db.permissions.findByPk(id, options);

    await permissions.update(
      {
        deletedBy: currentUser.id,
      },
      {
        transaction,
      },
    );

    await permissions.destroy({
      transaction,
    });

    return permissions;
  }

  static async findBy(where, options) {
    const transaction = (options && options.transaction) || undefined;

    // findOne accepts a single options object; include the transaction alongside where
    const permissions = await db.permissions.findOne({ where, transaction });

    if (!permissions) {
      return permissions;
    }

    const output = permissions.get({ plain: true });

    return output;
  }

  static async findAll(filter, options) {
    const limit = filter.limit || 0;
    let offset = 0;
    let where = {};
    const currentPage = +filter.page;

    offset = currentPage * limit;

    const orderBy = null;

    const transaction = (options && options.transaction) || undefined;

    let include = [];

    if (filter) {
      if (filter.id) {
        where = {
          ...where,
          ['id']: Utils.uuid(filter.id),
        };
      }

      if (filter.name) {
        where = {
          ...where,
          [Op.and]: Utils.ilike('permissions', 'name', filter.name),
        };
      }

      if (filter.active !== undefined) {
        where = {
          ...where,
          active: filter.active === true || filter.active === 'true',
        };
      }

      if (filter.createdAtRange) {
        const [start, end] = filter.createdAtRange;

        if (start !== undefined && start !== null && start !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.gte]: start,
            },
          };
        }

        if (end !== undefined && end !== null && end !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.lte]: end,
            },
          };
        }
      }
    }

    const queryOptions = {
      where,
      include,
      distinct: true,
      order:
        filter.field && filter.sort
          ? [[filter.field, filter.sort]]
          : [['createdAt', 'desc']],
      transaction: options?.transaction,
      logging: console.log,
    };

    if (!options?.countOnly) {
      queryOptions.limit = limit ? Number(limit) : undefined;
      queryOptions.offset = offset ? Number(offset) : undefined;
    }

    try {
      const { rows, count } = await db.permissions.findAndCountAll(
        queryOptions,
      );

      return {
        rows: options?.countOnly ? [] : rows,
        count: count,
      };
    } catch (error) {
      console.error('Error executing query:', error);
      throw error;
    }
  }

  static async findAllAutocomplete(query, limit, offset) {
    let where = {};

    if (query) {
      where = {
        [Op.or]: [
          { ['id']: Utils.uuid(query) },
          Utils.ilike('permissions', 'name', query),
        ],
      };
    }

    const records = await db.permissions.findAll({
      attributes: ['id', 'name'],
      where,
      limit: limit ? Number(limit) : undefined,
      offset: offset ? Number(offset) : undefined,
      // Sequelize expects `order`, not `orderBy`
      order: [['name', 'ASC']],
    });

    return records.map((record) => ({
      id: record.id,
      label: record.name,
    }));
  }
};
316
backend/src/db/api/roles.js
Normal file
@@ -0,0 +1,316 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');

const Sequelize = db.Sequelize;
const Op = Sequelize.Op;

module.exports = class RolesDBApi {
  static async create(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const roles = await db.roles.create(
      {
        id: data.id || undefined,

        name: data.name || null,
        role_customization: data.role_customization || null,
        importHash: data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );

    await roles.setPermissions(data.permissions || [], {
      transaction,
    });

    return roles;
  }

  static async bulkImport(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // Prepare data - wrapping individual data transformations in a map() method
    const rolesData = data.map((item, index) => ({
      id: item.id || undefined,

      name: item.name || null,
      role_customization: item.role_customization || null,
      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));

    // Bulk create items
    const roles = await db.roles.bulkCreate(rolesData, { transaction });

    // For each item created, replace relation files

    return roles;
  }

  static async update(id, data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // findByPk takes (id, options); pass the transaction in the options object
    const roles = await db.roles.findByPk(id, { transaction });

    const updatePayload = {};

    if (data.name !== undefined) updatePayload.name = data.name;

    if (data.role_customization !== undefined)
      updatePayload.role_customization = data.role_customization;

    updatePayload.updatedById = currentUser.id;

    await roles.update(updatePayload, { transaction });

    if (data.permissions !== undefined) {
      await roles.setPermissions(data.permissions, { transaction });
    }

    return roles;
  }

  static async deleteByIds(ids, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const roles = await db.roles.findAll({
      where: {
        id: {
          [Op.in]: ids,
        },
      },
      transaction,
    });

    await db.sequelize.transaction(async (transaction) => {
      for (const record of roles) {
        await record.update({ deletedBy: currentUser.id }, { transaction });
      }
      for (const record of roles) {
        await record.destroy({ transaction });
      }
    });

    return roles;
  }

  static async remove(id, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const roles = await db.roles.findByPk(id, options);

    await roles.update(
      {
        deletedBy: currentUser.id,
      },
      {
        transaction,
      },
    );

    await roles.destroy({
      transaction,
    });

    return roles;
  }

  static async findBy(where, options) {
    const transaction = (options && options.transaction) || undefined;

    // findOne accepts a single options object; include the transaction alongside where
    const roles = await db.roles.findOne({ where, transaction });

    if (!roles) {
      return roles;
    }

    const output = roles.get({ plain: true });

    output.users_app_role = await roles.getUsers_app_role({
      transaction,
    });

    output.permissions = await roles.getPermissions({
      transaction,
    });

    return output;
  }

  static async findAll(filter, options) {
    const limit = filter.limit || 0;
    let offset = 0;
    let where = {};
    const currentPage = +filter.page;

    offset = currentPage * limit;

    const orderBy = null;

    const transaction = (options && options.transaction) || undefined;

    let include = [
      {
        model: db.permissions,
        as: 'permissions',
        required: false,
      },
    ];

    if (filter) {
      if (filter.id) {
        where = {
          ...where,
          ['id']: Utils.uuid(filter.id),
        };
      }

      if (filter.name) {
        where = {
          ...where,
          [Op.and]: Utils.ilike('roles', 'name', filter.name),
        };
      }

      if (filter.role_customization) {
        where = {
          ...where,
          [Op.and]: Utils.ilike(
            'roles',
            'role_customization',
            filter.role_customization,
          ),
        };
      }

      if (filter.active !== undefined) {
        where = {
          ...where,
          active: filter.active === true || filter.active === 'true',
        };
      }

      if (filter.permissions) {
        const searchTerms = filter.permissions.split('|');

        include = [
          {
            model: db.permissions,
            as: 'permissions_filter',
            required: searchTerms.length > 0,
            where:
              searchTerms.length > 0
                ? {
                    [Op.or]: [
                      {
                        id: {
                          [Op.in]: searchTerms.map((term) => Utils.uuid(term)),
                        },
                      },
                      {
                        name: {
                          [Op.or]: searchTerms.map((term) => ({
                            [Op.iLike]: `%${term}%`,
                          })),
                        },
                      },
                    ],
                  }
                : undefined,
          },
          ...include,
        ];
      }

      if (filter.createdAtRange) {
        const [start, end] = filter.createdAtRange;

        if (start !== undefined && start !== null && start !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.gte]: start,
            },
          };
        }

        if (end !== undefined && end !== null && end !== '') {
          where = {
            ...where,
            ['createdAt']: {
              ...where.createdAt,
              [Op.lte]: end,
            },
          };
        }
      }
    }

    const queryOptions = {
      where,
      include,
      distinct: true,
      order:
        filter.field && filter.sort
          ? [[filter.field, filter.sort]]
          : [['createdAt', 'desc']],
      transaction: options?.transaction,
      logging: console.log,
    };

    if (!options?.countOnly) {
      queryOptions.limit = limit ? Number(limit) : undefined;
      queryOptions.offset = offset ? Number(offset) : undefined;
    }

    try {
      const { rows, count } = await db.roles.findAndCountAll(queryOptions);

      return {
        rows: options?.countOnly ? [] : rows,
        count: count,
      };
    } catch (error) {
      console.error('Error executing query:', error);
      throw error;
    }
  }

  static async findAllAutocomplete(query, limit, offset) {
    let where = {};

    if (query) {
      where = {
        [Op.or]: [
          { ['id']: Utils.uuid(query) },
          Utils.ilike('roles', 'name', query),
        ],
      };
    }

    const records = await db.roles.findAll({
      attributes: ['id', 'name'],
      where,
      limit: limit ? Number(limit) : undefined,
      offset: offset ? Number(offset) : undefined,
      // Sequelize expects `order`, not `orderBy`
      order: [['name', 'ASC']],
    });

    return records.map((record) => ({
      id: record.id,
      label: record.name,
    }));
  }
};
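`RolesDBApi.findAll` treats `filter.permissions` as a `|`-separated list of terms and swaps in a `permissions_filter` include that matches either permission ids or names, which lets a caller narrow roles by the permissions they grant. A sketch of such a query, assuming caller-supplied terms:

```javascript
const RolesDBApi = require('../db/api/roles');

async function findRolesWithPermission(term) {
  // e.g. term = 'READ_CONTACTS|EDIT_CONTACTS' matches roles holding either one
  const { rows } = await RolesDBApi.findAll({
    permissions: term,
    page: 0,
    limit: 50,
  });
  return rows;
}

module.exports = { findRolesWithPermission };
```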
292
backend/src/db/api/secure_gmail_tokens.js
Normal file
@@ -0,0 +1,292 @@
const db = require('../models');
const FileDBApi = require('./file');
const crypto = require('crypto');
const Utils = require('../utils');

const Sequelize = db.Sequelize;
const Op = Sequelize.Op;

module.exports = class Secure_gmail_tokensDBApi {
  static async create(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const secure_gmail_tokens = await db.secure_gmail_tokens.create(
      {
        id: data.id || undefined,

        importHash: data.importHash || null,
        createdById: currentUser.id,
        updatedById: currentUser.id,
      },
      { transaction },
    );

    await secure_gmail_tokens.setAccount(data.account || null, {
      transaction,
    });

    return secure_gmail_tokens;
  }

  static async bulkImport(data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // Prepare data - wrapping individual data transformations in a map() method
    const secure_gmail_tokensData = data.map((item, index) => ({
      id: item.id || undefined,

      importHash: item.importHash || null,
      createdById: currentUser.id,
      updatedById: currentUser.id,
      createdAt: new Date(Date.now() + index * 1000),
    }));

    // Bulk create items
    const secure_gmail_tokens = await db.secure_gmail_tokens.bulkCreate(
      secure_gmail_tokensData,
      { transaction },
    );

    // For each item created, replace relation files

    return secure_gmail_tokens;
  }

  static async update(id, data, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    // findByPk takes (id, options); pass the transaction in the options object
    const secure_gmail_tokens = await db.secure_gmail_tokens.findByPk(id, {
      transaction,
    });

    const updatePayload = {};

    updatePayload.updatedById = currentUser.id;

    await secure_gmail_tokens.update(updatePayload, { transaction });

    if (data.account !== undefined) {
      await secure_gmail_tokens.setAccount(data.account, { transaction });
    }

    return secure_gmail_tokens;
  }

  static async deleteByIds(ids, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const secure_gmail_tokens = await db.secure_gmail_tokens.findAll({
      where: {
        id: {
          [Op.in]: ids,
        },
      },
      transaction,
    });

    await db.sequelize.transaction(async (transaction) => {
      for (const record of secure_gmail_tokens) {
        await record.update({ deletedBy: currentUser.id }, { transaction });
      }
      for (const record of secure_gmail_tokens) {
        await record.destroy({ transaction });
      }
    });

    return secure_gmail_tokens;
  }

  static async remove(id, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const secure_gmail_tokens = await db.secure_gmail_tokens.findByPk(
      id,
      options,
    );

    await secure_gmail_tokens.update(
      {
        deletedBy: currentUser.id,
      },
      {
        transaction,
      },
    );

    await secure_gmail_tokens.destroy({
      transaction,
    });

    return secure_gmail_tokens;
  }

  static async findBy(where, options) {
    const transaction = (options && options.transaction) || undefined;

    // findOne accepts a single options object; include the transaction alongside where
    const secure_gmail_tokens = await db.secure_gmail_tokens.findOne({
      where,
      transaction,
    });

    if (!secure_gmail_tokens) {
      return secure_gmail_tokens;
    }

    const output = secure_gmail_tokens.get({ plain: true });

    output.account = await secure_gmail_tokens.getAccount({
      transaction,
    });

    return output;
  }

  static async findAll(filter, options) {
    const limit = filter.limit || 0;
    let offset = 0;
    let where = {};
    const currentPage = +filter.page;

    offset = currentPage * limit;

    const orderBy = null;

    const transaction = (options && options.transaction) || undefined;

    let include = [
      {
        model: db.accounts,
        as: 'account',

        where: filter.account
          ? {
              [Op.or]: [
                {
                  id: {
                    [Op.in]: filter.account
                      .split('|')
                      .map((term) => Utils.uuid(term)),
                  },
                },
                {
                  name: {
|
||||
[Op.or]: filter.account
|
||||
.split('|')
|
||||
.map((term) => ({ [Op.iLike]: `%${term}%` })),
|
||||
},
|
||||
},
|
||||
],
|
||||
}
|
||||
: {},
|
||||
},
|
||||
];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.secure_gmail_tokens.findAndCountAll(
|
||||
queryOptions,
|
||||
);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('secure_gmail_tokens', 'id', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.secure_gmail_tokens.findAll({
|
||||
attributes: ['id', 'id'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['id', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.id,
|
||||
}));
|
||||
}
|
||||
};
|
||||
242
backend/src/db/api/sent_emails.js
Normal file
242
backend/src/db/api/sent_emails.js
Normal file
@ -0,0 +1,242 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class Sent_emailsDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sent_emails = await db.sent_emails.create(
|
||||
{
|
||||
id: data.id || undefined,
|
||||
|
||||
importHash: data.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
return sent_emails;
|
||||
}
|
||||
|
||||
static async bulkImport(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
// Prepare data - wrapping individual data transformations in a map() method
|
||||
const sent_emailsData = data.map((item, index) => ({
|
||||
id: item.id || undefined,
|
||||
|
||||
importHash: item.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
createdAt: new Date(Date.now() + index * 1000),
|
||||
}));
|
||||
|
||||
// Bulk create items
|
||||
const sent_emails = await db.sent_emails.bulkCreate(sent_emailsData, {
|
||||
transaction,
|
||||
});
|
||||
|
||||
// For each item created, replace relation files
|
||||
|
||||
return sent_emails;
|
||||
}
|
||||
|
||||
static async update(id, data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sent_emails = await db.sent_emails.findByPk(id, {}, { transaction });
|
||||
|
||||
const updatePayload = {};
|
||||
|
||||
updatePayload.updatedById = currentUser.id;
|
||||
|
||||
await sent_emails.update(updatePayload, { transaction });
|
||||
|
||||
return sent_emails;
|
||||
}
|
||||
|
||||
static async deleteByIds(ids, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sent_emails = await db.sent_emails.findAll({
|
||||
where: {
|
||||
id: {
|
||||
[Op.in]: ids,
|
||||
},
|
||||
},
|
||||
transaction,
|
||||
});
|
||||
|
||||
await db.sequelize.transaction(async (transaction) => {
|
||||
for (const record of sent_emails) {
|
||||
await record.update({ deletedBy: currentUser.id }, { transaction });
|
||||
}
|
||||
for (const record of sent_emails) {
|
||||
await record.destroy({ transaction });
|
||||
}
|
||||
});
|
||||
|
||||
return sent_emails;
|
||||
}
|
||||
|
||||
static async remove(id, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sent_emails = await db.sent_emails.findByPk(id, options);
|
||||
|
||||
await sent_emails.update(
|
||||
{
|
||||
deletedBy: currentUser.id,
|
||||
},
|
||||
{
|
||||
transaction,
|
||||
},
|
||||
);
|
||||
|
||||
await sent_emails.destroy({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return sent_emails;
|
||||
}
|
||||
|
||||
static async findBy(where, options) {
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sent_emails = await db.sent_emails.findOne(
|
||||
{ where },
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
if (!sent_emails) {
|
||||
return sent_emails;
|
||||
}
|
||||
|
||||
const output = sent_emails.get({ plain: true });
|
||||
|
||||
return output;
|
||||
}
|
||||
|
||||
static async findAll(filter, options) {
|
||||
const limit = filter.limit || 0;
|
||||
let offset = 0;
|
||||
let where = {};
|
||||
const currentPage = +filter.page;
|
||||
|
||||
offset = currentPage * limit;
|
||||
|
||||
const orderBy = null;
|
||||
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
let include = [];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.sent_emails.findAndCountAll(
|
||||
queryOptions,
|
||||
);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('sent_emails', 'id', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.sent_emails.findAll({
|
||||
attributes: ['id', 'id'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['id', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.id,
|
||||
}));
|
||||
}
|
||||
};
|
||||
250
backend/src/db/api/sequence_assignments.js
Normal file
250
backend/src/db/api/sequence_assignments.js
Normal file
@ -0,0 +1,250 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class Sequence_assignmentsDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequence_assignments = await db.sequence_assignments.create(
|
||||
{
|
||||
id: data.id || undefined,
|
||||
|
||||
importHash: data.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
return sequence_assignments;
|
||||
}
|
||||
|
||||
static async bulkImport(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
// Prepare data - wrapping individual data transformations in a map() method
|
||||
const sequence_assignmentsData = data.map((item, index) => ({
|
||||
id: item.id || undefined,
|
||||
|
||||
importHash: item.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
createdAt: new Date(Date.now() + index * 1000),
|
||||
}));
|
||||
|
||||
// Bulk create items
|
||||
const sequence_assignments = await db.sequence_assignments.bulkCreate(
|
||||
sequence_assignmentsData,
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
// For each item created, replace relation files
|
||||
|
||||
return sequence_assignments;
|
||||
}
|
||||
|
||||
static async update(id, data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequence_assignments = await db.sequence_assignments.findByPk(
|
||||
id,
|
||||
{},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
const updatePayload = {};
|
||||
|
||||
updatePayload.updatedById = currentUser.id;
|
||||
|
||||
await sequence_assignments.update(updatePayload, { transaction });
|
||||
|
||||
return sequence_assignments;
|
||||
}
|
||||
|
||||
static async deleteByIds(ids, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequence_assignments = await db.sequence_assignments.findAll({
|
||||
where: {
|
||||
id: {
|
||||
[Op.in]: ids,
|
||||
},
|
||||
},
|
||||
transaction,
|
||||
});
|
||||
|
||||
await db.sequelize.transaction(async (transaction) => {
|
||||
for (const record of sequence_assignments) {
|
||||
await record.update({ deletedBy: currentUser.id }, { transaction });
|
||||
}
|
||||
for (const record of sequence_assignments) {
|
||||
await record.destroy({ transaction });
|
||||
}
|
||||
});
|
||||
|
||||
return sequence_assignments;
|
||||
}
|
||||
|
||||
static async remove(id, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequence_assignments = await db.sequence_assignments.findByPk(
|
||||
id,
|
||||
options,
|
||||
);
|
||||
|
||||
await sequence_assignments.update(
|
||||
{
|
||||
deletedBy: currentUser.id,
|
||||
},
|
||||
{
|
||||
transaction,
|
||||
},
|
||||
);
|
||||
|
||||
await sequence_assignments.destroy({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return sequence_assignments;
|
||||
}
|
||||
|
||||
static async findBy(where, options) {
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequence_assignments = await db.sequence_assignments.findOne(
|
||||
{ where },
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
if (!sequence_assignments) {
|
||||
return sequence_assignments;
|
||||
}
|
||||
|
||||
const output = sequence_assignments.get({ plain: true });
|
||||
|
||||
return output;
|
||||
}
|
||||
|
||||
static async findAll(filter, options) {
|
||||
const limit = filter.limit || 0;
|
||||
let offset = 0;
|
||||
let where = {};
|
||||
const currentPage = +filter.page;
|
||||
|
||||
offset = currentPage * limit;
|
||||
|
||||
const orderBy = null;
|
||||
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
let include = [];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.sequence_assignments.findAndCountAll(
|
||||
queryOptions,
|
||||
);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('sequence_assignments', 'id', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.sequence_assignments.findAll({
|
||||
attributes: ['id', 'id'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['id', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.id,
|
||||
}));
|
||||
}
|
||||
};
|
||||
247
backend/src/db/api/sequence_steps.js
Normal file
247
backend/src/db/api/sequence_steps.js
Normal file
@ -0,0 +1,247 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class Sequence_stepsDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequence_steps = await db.sequence_steps.create(
|
||||
{
|
||||
id: data.id || undefined,
|
||||
|
||||
importHash: data.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
return sequence_steps;
|
||||
}
|
||||
|
||||
static async bulkImport(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
// Prepare data - wrapping individual data transformations in a map() method
|
||||
const sequence_stepsData = data.map((item, index) => ({
|
||||
id: item.id || undefined,
|
||||
|
||||
importHash: item.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
createdAt: new Date(Date.now() + index * 1000),
|
||||
}));
|
||||
|
||||
// Bulk create items
|
||||
const sequence_steps = await db.sequence_steps.bulkCreate(
|
||||
sequence_stepsData,
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
// For each item created, replace relation files
|
||||
|
||||
return sequence_steps;
|
||||
}
|
||||
|
||||
static async update(id, data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequence_steps = await db.sequence_steps.findByPk(
|
||||
id,
|
||||
{},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
const updatePayload = {};
|
||||
|
||||
updatePayload.updatedById = currentUser.id;
|
||||
|
||||
await sequence_steps.update(updatePayload, { transaction });
|
||||
|
||||
return sequence_steps;
|
||||
}
|
||||
|
||||
static async deleteByIds(ids, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequence_steps = await db.sequence_steps.findAll({
|
||||
where: {
|
||||
id: {
|
||||
[Op.in]: ids,
|
||||
},
|
||||
},
|
||||
transaction,
|
||||
});
|
||||
|
||||
await db.sequelize.transaction(async (transaction) => {
|
||||
for (const record of sequence_steps) {
|
||||
await record.update({ deletedBy: currentUser.id }, { transaction });
|
||||
}
|
||||
for (const record of sequence_steps) {
|
||||
await record.destroy({ transaction });
|
||||
}
|
||||
});
|
||||
|
||||
return sequence_steps;
|
||||
}
|
||||
|
||||
static async remove(id, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequence_steps = await db.sequence_steps.findByPk(id, options);
|
||||
|
||||
await sequence_steps.update(
|
||||
{
|
||||
deletedBy: currentUser.id,
|
||||
},
|
||||
{
|
||||
transaction,
|
||||
},
|
||||
);
|
||||
|
||||
await sequence_steps.destroy({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return sequence_steps;
|
||||
}
|
||||
|
||||
static async findBy(where, options) {
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequence_steps = await db.sequence_steps.findOne(
|
||||
{ where },
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
if (!sequence_steps) {
|
||||
return sequence_steps;
|
||||
}
|
||||
|
||||
const output = sequence_steps.get({ plain: true });
|
||||
|
||||
return output;
|
||||
}
|
||||
|
||||
static async findAll(filter, options) {
|
||||
const limit = filter.limit || 0;
|
||||
let offset = 0;
|
||||
let where = {};
|
||||
const currentPage = +filter.page;
|
||||
|
||||
offset = currentPage * limit;
|
||||
|
||||
const orderBy = null;
|
||||
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
let include = [];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.sequence_steps.findAndCountAll(
|
||||
queryOptions,
|
||||
);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('sequence_steps', 'id', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.sequence_steps.findAll({
|
||||
attributes: ['id', 'id'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['id', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.id,
|
||||
}));
|
||||
}
|
||||
};
|
||||
237
backend/src/db/api/sequences.js
Normal file
237
backend/src/db/api/sequences.js
Normal file
@ -0,0 +1,237 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class SequencesDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequences = await db.sequences.create(
|
||||
{
|
||||
id: data.id || undefined,
|
||||
|
||||
importHash: data.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
return sequences;
|
||||
}
|
||||
|
||||
static async bulkImport(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
// Prepare data - wrapping individual data transformations in a map() method
|
||||
const sequencesData = data.map((item, index) => ({
|
||||
id: item.id || undefined,
|
||||
|
||||
importHash: item.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
createdAt: new Date(Date.now() + index * 1000),
|
||||
}));
|
||||
|
||||
// Bulk create items
|
||||
const sequences = await db.sequences.bulkCreate(sequencesData, {
|
||||
transaction,
|
||||
});
|
||||
|
||||
// For each item created, replace relation files
|
||||
|
||||
return sequences;
|
||||
}
|
||||
|
||||
static async update(id, data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequences = await db.sequences.findByPk(id, {}, { transaction });
|
||||
|
||||
const updatePayload = {};
|
||||
|
||||
updatePayload.updatedById = currentUser.id;
|
||||
|
||||
await sequences.update(updatePayload, { transaction });
|
||||
|
||||
return sequences;
|
||||
}
|
||||
|
||||
static async deleteByIds(ids, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequences = await db.sequences.findAll({
|
||||
where: {
|
||||
id: {
|
||||
[Op.in]: ids,
|
||||
},
|
||||
},
|
||||
transaction,
|
||||
});
|
||||
|
||||
await db.sequelize.transaction(async (transaction) => {
|
||||
for (const record of sequences) {
|
||||
await record.update({ deletedBy: currentUser.id }, { transaction });
|
||||
}
|
||||
for (const record of sequences) {
|
||||
await record.destroy({ transaction });
|
||||
}
|
||||
});
|
||||
|
||||
return sequences;
|
||||
}
|
||||
|
||||
static async remove(id, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequences = await db.sequences.findByPk(id, options);
|
||||
|
||||
await sequences.update(
|
||||
{
|
||||
deletedBy: currentUser.id,
|
||||
},
|
||||
{
|
||||
transaction,
|
||||
},
|
||||
);
|
||||
|
||||
await sequences.destroy({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return sequences;
|
||||
}
|
||||
|
||||
static async findBy(where, options) {
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const sequences = await db.sequences.findOne({ where }, { transaction });
|
||||
|
||||
if (!sequences) {
|
||||
return sequences;
|
||||
}
|
||||
|
||||
const output = sequences.get({ plain: true });
|
||||
|
||||
return output;
|
||||
}
|
||||
|
||||
static async findAll(filter, options) {
|
||||
const limit = filter.limit || 0;
|
||||
let offset = 0;
|
||||
let where = {};
|
||||
const currentPage = +filter.page;
|
||||
|
||||
offset = currentPage * limit;
|
||||
|
||||
const orderBy = null;
|
||||
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
let include = [];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.sequences.findAndCountAll(queryOptions);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('sequences', 'id', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.sequences.findAll({
|
||||
attributes: ['id', 'id'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['id', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.id,
|
||||
}));
|
||||
}
|
||||
};
|
||||
250
backend/src/db/api/tags.js
Normal file
250
backend/src/db/api/tags.js
Normal file
@ -0,0 +1,250 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class TagsDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const tags = await db.tags.create(
|
||||
{
|
||||
id: data.id || undefined,
|
||||
|
||||
name: data.name || null,
|
||||
importHash: data.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
return tags;
|
||||
}
|
||||
|
||||
static async bulkImport(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
// Prepare data - wrapping individual data transformations in a map() method
|
||||
const tagsData = data.map((item, index) => ({
|
||||
id: item.id || undefined,
|
||||
|
||||
name: item.name || null,
|
||||
importHash: item.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
createdAt: new Date(Date.now() + index * 1000),
|
||||
}));
|
||||
|
||||
// Bulk create items
|
||||
const tags = await db.tags.bulkCreate(tagsData, { transaction });
|
||||
|
||||
// For each item created, replace relation files
|
||||
|
||||
return tags;
|
||||
}
|
||||
|
||||
static async update(id, data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const tags = await db.tags.findByPk(id, {}, { transaction });
|
||||
|
||||
const updatePayload = {};
|
||||
|
||||
if (data.name !== undefined) updatePayload.name = data.name;
|
||||
|
||||
updatePayload.updatedById = currentUser.id;
|
||||
|
||||
await tags.update(updatePayload, { transaction });
|
||||
|
||||
return tags;
|
||||
}
|
||||
|
||||
static async deleteByIds(ids, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const tags = await db.tags.findAll({
|
||||
where: {
|
||||
id: {
|
||||
[Op.in]: ids,
|
||||
},
|
||||
},
|
||||
transaction,
|
||||
});
|
||||
|
||||
await db.sequelize.transaction(async (transaction) => {
|
||||
for (const record of tags) {
|
||||
await record.update({ deletedBy: currentUser.id }, { transaction });
|
||||
}
|
||||
for (const record of tags) {
|
||||
await record.destroy({ transaction });
|
||||
}
|
||||
});
|
||||
|
||||
return tags;
|
||||
}
|
||||
|
||||
static async remove(id, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const tags = await db.tags.findByPk(id, options);
|
||||
|
||||
await tags.update(
|
||||
{
|
||||
deletedBy: currentUser.id,
|
||||
},
|
||||
{
|
||||
transaction,
|
||||
},
|
||||
);
|
||||
|
||||
await tags.destroy({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return tags;
|
||||
}
|
||||
|
||||
static async findBy(where, options) {
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const tags = await db.tags.findOne({ where }, { transaction });
|
||||
|
||||
if (!tags) {
|
||||
return tags;
|
||||
}
|
||||
|
||||
const output = tags.get({ plain: true });
|
||||
|
||||
output.contact_tags_tag = await tags.getContact_tags_tag({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return output;
|
||||
}
|
||||
|
||||
static async findAll(filter, options) {
|
||||
const limit = filter.limit || 0;
|
||||
let offset = 0;
|
||||
let where = {};
|
||||
const currentPage = +filter.page;
|
||||
|
||||
offset = currentPage * limit;
|
||||
|
||||
const orderBy = null;
|
||||
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
let include = [];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.name) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike('tags', 'name', filter.name),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.tags.findAndCountAll(queryOptions);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('tags', 'name', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.tags.findAll({
|
||||
attributes: ['id', 'name'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['name', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.name,
|
||||
}));
|
||||
}
|
||||
};
|
||||
751
backend/src/db/api/users.js
Normal file
751
backend/src/db/api/users.js
Normal file
@ -0,0 +1,751 @@
|
||||
const db = require('../models');
|
||||
const FileDBApi = require('./file');
|
||||
const crypto = require('crypto');
|
||||
const Utils = require('../utils');
|
||||
|
||||
const bcrypt = require('bcrypt');
|
||||
const config = require('../../config');
|
||||
|
||||
const Sequelize = db.Sequelize;
|
||||
const Op = Sequelize.Op;
|
||||
|
||||
module.exports = class UsersDBApi {
|
||||
static async create(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const users = await db.users.create(
|
||||
{
|
||||
id: data.data.id || undefined,
|
||||
|
||||
firstName: data.data.firstName || null,
|
||||
lastName: data.data.lastName || null,
|
||||
phoneNumber: data.data.phoneNumber || null,
|
||||
email: data.data.email || null,
|
||||
disabled: data.data.disabled || false,
|
||||
|
||||
password: data.data.password || null,
|
||||
emailVerified: data.data.emailVerified || true,
|
||||
|
||||
emailVerificationToken: data.data.emailVerificationToken || null,
|
||||
emailVerificationTokenExpiresAt:
|
||||
data.data.emailVerificationTokenExpiresAt || null,
|
||||
passwordResetToken: data.data.passwordResetToken || null,
|
||||
passwordResetTokenExpiresAt:
|
||||
data.data.passwordResetTokenExpiresAt || null,
|
||||
provider: data.data.provider || null,
|
||||
importHash: data.data.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
if (!data.data.app_role) {
|
||||
const role = await db.roles.findOne({
|
||||
where: { name: 'User' },
|
||||
});
|
||||
if (role) {
|
||||
await users.setApp_role(role, {
|
||||
transaction,
|
||||
});
|
||||
}
|
||||
} else {
|
||||
await users.setApp_role(data.data.app_role || null, {
|
||||
transaction,
|
||||
});
|
||||
}
|
||||
|
||||
await users.setCustom_permissions(data.data.custom_permissions || [], {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await FileDBApi.replaceRelationFiles(
|
||||
{
|
||||
belongsTo: db.users.getTableName(),
|
||||
belongsToColumn: 'avatar',
|
||||
belongsToId: users.id,
|
||||
},
|
||||
data.data.avatar,
|
||||
options,
|
||||
);
|
||||
|
||||
return users;
|
||||
}
|
||||
|
||||
static async bulkImport(data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
// Prepare data - wrapping individual data transformations in a map() method
|
||||
const usersData = data.map((item, index) => ({
|
||||
id: item.id || undefined,
|
||||
|
||||
firstName: item.firstName || null,
|
||||
lastName: item.lastName || null,
|
||||
phoneNumber: item.phoneNumber || null,
|
||||
email: item.email || null,
|
||||
disabled: item.disabled || false,
|
||||
|
||||
password: item.password || null,
|
||||
emailVerified: item.emailVerified || false,
|
||||
|
||||
emailVerificationToken: item.emailVerificationToken || null,
|
||||
emailVerificationTokenExpiresAt:
|
||||
item.emailVerificationTokenExpiresAt || null,
|
||||
passwordResetToken: item.passwordResetToken || null,
|
||||
passwordResetTokenExpiresAt: item.passwordResetTokenExpiresAt || null,
|
||||
provider: item.provider || null,
|
||||
importHash: item.importHash || null,
|
||||
createdById: currentUser.id,
|
||||
updatedById: currentUser.id,
|
||||
createdAt: new Date(Date.now() + index * 1000),
|
||||
}));
|
||||
|
||||
// Bulk create items
|
||||
const users = await db.users.bulkCreate(usersData, { transaction });
|
||||
|
||||
// For each item created, replace relation files
|
||||
|
||||
for (let i = 0; i < users.length; i++) {
|
||||
await FileDBApi.replaceRelationFiles(
|
||||
{
|
||||
belongsTo: db.users.getTableName(),
|
||||
belongsToColumn: 'avatar',
|
||||
belongsToId: users[i].id,
|
||||
},
|
||||
data[i].avatar,
|
||||
options,
|
||||
);
|
||||
}
|
||||
|
||||
return users;
|
||||
}
|
||||
|
||||
static async update(id, data, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const users = await db.users.findByPk(id, {}, { transaction });
|
||||
|
||||
if (!data?.app_role) {
|
||||
data.app_role = users?.app_role?.id;
|
||||
}
|
||||
if (!data?.custom_permissions) {
|
||||
data.custom_permissions = users?.custom_permissions?.map(
|
||||
(item) => item.id,
|
||||
);
|
||||
}
|
||||
|
||||
if (data.password) {
|
||||
data.password = bcrypt.hashSync(data.password, config.bcrypt.saltRounds);
|
||||
} else {
|
||||
data.password = users.password;
|
||||
}
|
||||
|
||||
const updatePayload = {};
|
||||
|
||||
if (data.firstName !== undefined) updatePayload.firstName = data.firstName;
|
||||
|
||||
if (data.lastName !== undefined) updatePayload.lastName = data.lastName;
|
||||
|
||||
if (data.phoneNumber !== undefined)
|
||||
updatePayload.phoneNumber = data.phoneNumber;
|
||||
|
||||
if (data.email !== undefined) updatePayload.email = data.email;
|
||||
|
||||
if (data.disabled !== undefined) updatePayload.disabled = data.disabled;
|
||||
|
||||
if (data.password !== undefined) updatePayload.password = data.password;
|
||||
|
||||
if (data.emailVerified !== undefined)
|
||||
updatePayload.emailVerified = data.emailVerified;
|
||||
else updatePayload.emailVerified = true;
|
||||
|
||||
if (data.emailVerificationToken !== undefined)
|
||||
updatePayload.emailVerificationToken = data.emailVerificationToken;
|
||||
|
||||
if (data.emailVerificationTokenExpiresAt !== undefined)
|
||||
updatePayload.emailVerificationTokenExpiresAt =
|
||||
data.emailVerificationTokenExpiresAt;
|
||||
|
||||
if (data.passwordResetToken !== undefined)
|
||||
updatePayload.passwordResetToken = data.passwordResetToken;
|
||||
|
||||
if (data.passwordResetTokenExpiresAt !== undefined)
|
||||
updatePayload.passwordResetTokenExpiresAt =
|
||||
data.passwordResetTokenExpiresAt;
|
||||
|
||||
if (data.provider !== undefined) updatePayload.provider = data.provider;
|
||||
|
||||
updatePayload.updatedById = currentUser.id;
|
||||
|
||||
await users.update(updatePayload, { transaction });
|
||||
|
||||
if (data.app_role !== undefined) {
|
||||
await users.setApp_role(
|
||||
data.app_role,
|
||||
|
||||
{ transaction },
|
||||
);
|
||||
}
|
||||
|
||||
if (data.custom_permissions !== undefined) {
|
||||
await users.setCustom_permissions(data.custom_permissions, {
|
||||
transaction,
|
||||
});
|
||||
}
|
||||
|
||||
await FileDBApi.replaceRelationFiles(
|
||||
{
|
||||
belongsTo: db.users.getTableName(),
|
||||
belongsToColumn: 'avatar',
|
||||
belongsToId: users.id,
|
||||
},
|
||||
data.avatar,
|
||||
options,
|
||||
);
|
||||
|
||||
return users;
|
||||
}
|
||||
|
||||
static async deleteByIds(ids, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const users = await db.users.findAll({
|
||||
where: {
|
||||
id: {
|
||||
[Op.in]: ids,
|
||||
},
|
||||
},
|
||||
transaction,
|
||||
});
|
||||
|
||||
await db.sequelize.transaction(async (transaction) => {
|
||||
for (const record of users) {
|
||||
await record.update({ deletedBy: currentUser.id }, { transaction });
|
||||
}
|
||||
for (const record of users) {
|
||||
await record.destroy({ transaction });
|
||||
}
|
||||
});
|
||||
|
||||
return users;
|
||||
}
|
||||
|
||||
static async remove(id, options) {
|
||||
const currentUser = (options && options.currentUser) || { id: null };
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const users = await db.users.findByPk(id, options);
|
||||
|
||||
await users.update(
|
||||
{
|
||||
deletedBy: currentUser.id,
|
||||
},
|
||||
{
|
||||
transaction,
|
||||
},
|
||||
);
|
||||
|
||||
await users.destroy({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return users;
|
||||
}
|
||||
|
||||
static async findBy(where, options) {
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
const users = await db.users.findOne({ where }, { transaction });
|
||||
|
||||
if (!users) {
|
||||
return users;
|
||||
}
|
||||
|
||||
const output = users.get({ plain: true });
|
||||
|
||||
output.conversations_user = await users.getConversations_user({
|
||||
transaction,
|
||||
});
|
||||
|
||||
output.accounts_user = await users.getAccounts_user({
|
||||
transaction,
|
||||
});
|
||||
|
||||
output.avatar = await users.getAvatar({
|
||||
transaction,
|
||||
});
|
||||
|
||||
output.app_role = await users.getApp_role({
|
||||
transaction,
|
||||
});
|
||||
|
||||
if (output.app_role) {
|
||||
output.app_role_permissions = await output.app_role.getPermissions({
|
||||
transaction,
|
||||
});
|
||||
}
|
||||
|
||||
output.custom_permissions = await users.getCustom_permissions({
|
||||
transaction,
|
||||
});
|
||||
|
||||
return output;
|
||||
}
|
||||
|
||||
static async findAll(filter, options) {
|
||||
const limit = filter.limit || 0;
|
||||
let offset = 0;
|
||||
let where = {};
|
||||
const currentPage = +filter.page;
|
||||
|
||||
offset = currentPage * limit;
|
||||
|
||||
const orderBy = null;
|
||||
|
||||
const transaction = (options && options.transaction) || undefined;
|
||||
|
||||
let include = [
|
||||
{
|
||||
model: db.roles,
|
||||
as: 'app_role',
|
||||
|
||||
where: filter.app_role
|
||||
? {
|
||||
[Op.or]: [
|
||||
{
|
||||
id: {
|
||||
[Op.in]: filter.app_role
|
||||
.split('|')
|
||||
.map((term) => Utils.uuid(term)),
|
||||
},
|
||||
},
|
||||
{
|
||||
name: {
|
||||
[Op.or]: filter.app_role
|
||||
.split('|')
|
||||
.map((term) => ({ [Op.iLike]: `%${term}%` })),
|
||||
},
|
||||
},
|
||||
],
|
||||
}
|
||||
: {},
|
||||
},
|
||||
|
||||
{
|
||||
model: db.permissions,
|
||||
as: 'custom_permissions',
|
||||
required: false,
|
||||
},
|
||||
|
||||
{
|
||||
model: db.file,
|
||||
as: 'avatar',
|
||||
},
|
||||
];
|
||||
|
||||
if (filter) {
|
||||
if (filter.id) {
|
||||
where = {
|
||||
...where,
|
||||
['id']: Utils.uuid(filter.id),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.firstName) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike('users', 'firstName', filter.firstName),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.lastName) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike('users', 'lastName', filter.lastName),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.phoneNumber) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike('users', 'phoneNumber', filter.phoneNumber),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.email) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike('users', 'email', filter.email),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.password) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike('users', 'password', filter.password),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.emailVerificationToken) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike(
|
||||
'users',
|
||||
'emailVerificationToken',
|
||||
filter.emailVerificationToken,
|
||||
),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.passwordResetToken) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike(
|
||||
'users',
|
||||
'passwordResetToken',
|
||||
filter.passwordResetToken,
|
||||
),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.provider) {
|
||||
where = {
|
||||
...where,
|
||||
[Op.and]: Utils.ilike('users', 'provider', filter.provider),
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.emailVerificationTokenExpiresAtRange) {
|
||||
const [start, end] = filter.emailVerificationTokenExpiresAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
emailVerificationTokenExpiresAt: {
|
||||
...where.emailVerificationTokenExpiresAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
emailVerificationTokenExpiresAt: {
|
||||
...where.emailVerificationTokenExpiresAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
if (filter.passwordResetTokenExpiresAtRange) {
|
||||
const [start, end] = filter.passwordResetTokenExpiresAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
passwordResetTokenExpiresAt: {
|
||||
...where.passwordResetTokenExpiresAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
passwordResetTokenExpiresAt: {
|
||||
...where.passwordResetTokenExpiresAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
if (filter.active !== undefined) {
|
||||
where = {
|
||||
...where,
|
||||
active: filter.active === true || filter.active === 'true',
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.disabled) {
|
||||
where = {
|
||||
...where,
|
||||
disabled: filter.disabled,
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.emailVerified) {
|
||||
where = {
|
||||
...where,
|
||||
emailVerified: filter.emailVerified,
|
||||
};
|
||||
}
|
||||
|
||||
if (filter.custom_permissions) {
|
||||
const searchTerms = filter.custom_permissions.split('|');
|
||||
|
||||
include = [
|
||||
{
|
||||
model: db.permissions,
|
||||
as: 'custom_permissions_filter',
|
||||
required: searchTerms.length > 0,
|
||||
where:
|
||||
searchTerms.length > 0
|
||||
? {
|
||||
[Op.or]: [
|
||||
{
|
||||
id: {
|
||||
[Op.in]: searchTerms.map((term) => Utils.uuid(term)),
|
||||
},
|
||||
},
|
||||
{
|
||||
name: {
|
||||
[Op.or]: searchTerms.map((term) => ({
|
||||
[Op.iLike]: `%${term}%`,
|
||||
})),
|
||||
},
|
||||
},
|
||||
],
|
||||
}
|
||||
: undefined,
|
||||
},
|
||||
...include,
|
||||
];
|
||||
}
|
||||
|
||||
if (filter.createdAtRange) {
|
||||
const [start, end] = filter.createdAtRange;
|
||||
|
||||
if (start !== undefined && start !== null && start !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.gte]: start,
|
||||
},
|
||||
};
|
||||
}
|
||||
|
||||
if (end !== undefined && end !== null && end !== '') {
|
||||
where = {
|
||||
...where,
|
||||
['createdAt']: {
|
||||
...where.createdAt,
|
||||
[Op.lte]: end,
|
||||
},
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const queryOptions = {
|
||||
where,
|
||||
include,
|
||||
distinct: true,
|
||||
order:
|
||||
filter.field && filter.sort
|
||||
? [[filter.field, filter.sort]]
|
||||
: [['createdAt', 'desc']],
|
||||
transaction: options?.transaction,
|
||||
logging: console.log,
|
||||
};
|
||||
|
||||
if (!options?.countOnly) {
|
||||
queryOptions.limit = limit ? Number(limit) : undefined;
|
||||
queryOptions.offset = offset ? Number(offset) : undefined;
|
||||
}
|
||||
|
||||
try {
|
||||
const { rows, count } = await db.users.findAndCountAll(queryOptions);
|
||||
|
||||
return {
|
||||
rows: options?.countOnly ? [] : rows,
|
||||
count: count,
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Error executing query:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
static async findAllAutocomplete(query, limit, offset) {
|
||||
let where = {};
|
||||
|
||||
if (query) {
|
||||
where = {
|
||||
[Op.or]: [
|
||||
{ ['id']: Utils.uuid(query) },
|
||||
Utils.ilike('users', 'firstName', query),
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
const records = await db.users.findAll({
|
||||
attributes: ['id', 'firstName'],
|
||||
where,
|
||||
limit: limit ? Number(limit) : undefined,
|
||||
offset: offset ? Number(offset) : undefined,
|
||||
orderBy: [['firstName', 'ASC']],
|
||||
});
|
||||
|
||||
return records.map((record) => ({
|
||||
id: record.id,
|
||||
label: record.firstName,
|
||||
}));
|
||||
}
|
||||
|
||||
  static async createFromAuth(data, options) {
    const transaction = (options && options.transaction) || undefined;
    const users = await db.users.create(
      {
        email: data.email,
        firstName: data.firstName,
        authenticationUid: data.authenticationUid,
        password: data.password,
      },
      { transaction },
    );

    const app_role = await db.roles.findOne({
      where: { name: config.roles?.user || 'User' },
    });
    if (app_role?.id) {
      await users.setApp_role(app_role?.id || null, {
        transaction,
      });
    }

    await users.update(
      {
        authenticationUid: users.id,
      },
      { transaction },
    );

    delete users.password;
    return users;
  }

  static async updatePassword(id, password, options) {
    const currentUser = (options && options.currentUser) || { id: null };

    const transaction = (options && options.transaction) || undefined;

    const users = await db.users.findByPk(id, {
      transaction,
    });

    await users.update(
      {
        password,
        authenticationUid: id,
        updatedById: currentUser.id,
      },
      { transaction },
    );

    return users;
  }

  static async generateEmailVerificationToken(email, options) {
    return this._generateToken(
      ['emailVerificationToken', 'emailVerificationTokenExpiresAt'],
      email,
      options,
    );
  }

  static async generatePasswordResetToken(email, options) {
    return this._generateToken(
      ['passwordResetToken', 'passwordResetTokenExpiresAt'],
      email,
      options,
    );
  }

  static async findByPasswordResetToken(token, options) {
    const transaction = (options && options.transaction) || undefined;

    // findOne takes a single options object, so the transaction must be part of it
    // rather than a separate second argument (which Sequelize would ignore).
    return db.users.findOne({
      where: {
        passwordResetToken: token,
        passwordResetTokenExpiresAt: {
          [db.Sequelize.Op.gt]: Date.now(),
        },
      },
      transaction,
    });
  }

  static async findByEmailVerificationToken(token, options) {
    const transaction = (options && options.transaction) || undefined;
    return db.users.findOne({
      where: {
        emailVerificationToken: token,
        emailVerificationTokenExpiresAt: {
          [db.Sequelize.Op.gt]: Date.now(),
        },
      },
      transaction,
    });
  }

  static async markEmailVerified(id, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;

    const users = await db.users.findByPk(id, {
      transaction,
    });

    await users.update(
      {
        emailVerified: true,
        updatedById: currentUser.id,
      },
      { transaction },
    );

    return true;
  }

  static async _generateToken(keyNames, email, options) {
    const currentUser = (options && options.currentUser) || { id: null };
    const transaction = (options && options.transaction) || undefined;
    // The transaction belongs inside the single findOne options object.
    const users = await db.users.findOne({
      where: { email: email.toLowerCase() },
      transaction,
    });

    const token = crypto.randomBytes(20).toString('hex');
    // Token lifetime as written: 360000 ms (6 minutes).
    const tokenExpiresAt = Date.now() + 360000;

    if (users) {
      await users.update(
        {
          [keyNames[0]]: token,
          [keyNames[1]]: tokenExpiresAt,
          updatedById: currentUser.id,
        },
        { transaction },
      );
    }

    return token;
  }
};
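For orientation, the token helpers above are meant to be used in pairs: a token is issued against an email address and later resolved back to a user while it is still unexpired. A minimal usage sketch, assuming this class is exported from the module as `UsersDBApi` (the export name and require path are not visible in this excerpt):

```javascript
// Hypothetical usage sketch; `UsersDBApi` and the require path are assumptions.
const UsersDBApi = require('./users');

async function passwordResetFlow(email) {
  // Issue a reset token for the account tied to this email...
  const token = await UsersDBApi.generatePasswordResetToken(email);
  // ...and later resolve it back to the user, as long as it has not expired.
  const user = await UsersDBApi.findByPasswordResetToken(token);
  return user ? user.id : null;
}
```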
31
backend/src/db/db.config.js
Normal file
@@ -0,0 +1,31 @@
module.exports = {
  production: {
    dialect: 'postgres',
    username: process.env.DB_USER,
    password: process.env.DB_PASS,
    database: process.env.DB_NAME,
    host: process.env.DB_HOST,
    port: process.env.DB_PORT,
    logging: console.log,
    seederStorage: 'sequelize',
  },
  development: {
    username: 'postgres',
    dialect: 'postgres',
    password: '',
    database: 'db_ai_agent_hub',
    host: process.env.DB_HOST || 'localhost',
    logging: console.log,
    seederStorage: 'sequelize',
  },
  dev_stage: {
    dialect: 'postgres',
    username: process.env.DB_USER,
    password: process.env.DB_PASS,
    database: process.env.DB_NAME,
    host: process.env.DB_HOST,
    port: process.env.DB_PORT,
    logging: console.log,
    seederStorage: 'sequelize',
  },
};
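The config above keys one Postgres connection block per environment name. A minimal sketch of how such a config might feed the standard Sequelize constructor, assuming the environment is selected via `NODE_ENV` (the fallback and require path are assumptions, not shown in this commit):

```javascript
// Minimal sketch: pick the block for the current environment and build a client.
const { Sequelize } = require('sequelize');
const config = require('./db.config')[process.env.NODE_ENV || 'development'];

const sequelize = new Sequelize(config.database, config.username, config.password, {
  host: config.host,
  port: config.port,
  dialect: config.dialect,
  logging: config.logging,
  // seederStorage is consumed by sequelize-cli, not by the constructor itself.
});
```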
431
backend/src/db/migrations/1746169387299.js
Normal file
@@ -0,0 +1,431 @@
module.exports = {
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async up(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.createTable(
        'users',
        {
          id: {
            type: Sequelize.DataTypes.UUID,
            defaultValue: Sequelize.DataTypes.UUIDV4,
            primaryKey: true,
          },
          createdById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          updatedById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          createdAt: { type: Sequelize.DataTypes.DATE },
          updatedAt: { type: Sequelize.DataTypes.DATE },
          deletedAt: { type: Sequelize.DataTypes.DATE },
          importHash: {
            type: Sequelize.DataTypes.STRING(255),
            allowNull: true,
            unique: true,
          },
        },
        { transaction },
      );

      await queryInterface.createTable(
        'agents',
        {
          id: {
            type: Sequelize.DataTypes.UUID,
            defaultValue: Sequelize.DataTypes.UUIDV4,
            primaryKey: true,
          },
          createdById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          updatedById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          createdAt: { type: Sequelize.DataTypes.DATE },
          updatedAt: { type: Sequelize.DataTypes.DATE },
          deletedAt: { type: Sequelize.DataTypes.DATE },
          importHash: {
            type: Sequelize.DataTypes.STRING(255),
            allowNull: true,
            unique: true,
          },
        },
        { transaction },
      );

      await queryInterface.createTable(
        'roles',
        {
          id: {
            type: Sequelize.DataTypes.UUID,
            defaultValue: Sequelize.DataTypes.UUIDV4,
            primaryKey: true,
          },
          createdById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          updatedById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          createdAt: { type: Sequelize.DataTypes.DATE },
          updatedAt: { type: Sequelize.DataTypes.DATE },
          deletedAt: { type: Sequelize.DataTypes.DATE },
          importHash: {
            type: Sequelize.DataTypes.STRING(255),
            allowNull: true,
            unique: true,
          },
        },
        { transaction },
      );

      await queryInterface.createTable(
        'permissions',
        {
          id: {
            type: Sequelize.DataTypes.UUID,
            defaultValue: Sequelize.DataTypes.UUIDV4,
            primaryKey: true,
          },
          createdById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          updatedById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          createdAt: { type: Sequelize.DataTypes.DATE },
          updatedAt: { type: Sequelize.DataTypes.DATE },
          deletedAt: { type: Sequelize.DataTypes.DATE },
          importHash: {
            type: Sequelize.DataTypes.STRING(255),
            allowNull: true,
            unique: true,
          },
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'firstName',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'lastName',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'phoneNumber',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'email',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'disabled',
        {
          type: Sequelize.DataTypes.BOOLEAN,
          defaultValue: false,
          allowNull: false,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'password',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'emailVerified',
        {
          type: Sequelize.DataTypes.BOOLEAN,
          defaultValue: false,
          allowNull: false,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'emailVerificationToken',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'emailVerificationTokenExpiresAt',
        {
          type: Sequelize.DataTypes.DATE,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'passwordResetToken',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'passwordResetTokenExpiresAt',
        {
          type: Sequelize.DataTypes.DATE,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'provider',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'agents',
        'name',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'agents',
        'expertise',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'agents',
        'purpose',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'agents',
        'status',
        {
          type: Sequelize.DataTypes.ENUM,
          values: ['active', 'inactive', 'development'],
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'permissions',
        'name',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'roles',
        'name',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'roles',
        'role_customization',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await queryInterface.addColumn(
        'users',
        'app_roleId',
        {
          type: Sequelize.DataTypes.UUID,
          references: {
            model: 'roles',
            key: 'id',
          },
        },
        { transaction },
      );

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async down(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.removeColumn('users', 'app_roleId', { transaction });

      await queryInterface.removeColumn('roles', 'role_customization', {
        transaction,
      });

      await queryInterface.removeColumn('roles', 'name', { transaction });

      await queryInterface.removeColumn('permissions', 'name', { transaction });

      await queryInterface.removeColumn('agents', 'status', { transaction });

      await queryInterface.removeColumn('agents', 'purpose', { transaction });

      await queryInterface.removeColumn('agents', 'expertise', { transaction });

      await queryInterface.removeColumn('agents', 'name', { transaction });

      await queryInterface.removeColumn('users', 'provider', { transaction });

      await queryInterface.removeColumn(
        'users',
        'passwordResetTokenExpiresAt',
        { transaction },
      );

      await queryInterface.removeColumn('users', 'passwordResetToken', {
        transaction,
      });

      await queryInterface.removeColumn(
        'users',
        'emailVerificationTokenExpiresAt',
        { transaction },
      );

      await queryInterface.removeColumn('users', 'emailVerificationToken', {
        transaction,
      });

      await queryInterface.removeColumn('users', 'emailVerified', {
        transaction,
      });

      await queryInterface.removeColumn('users', 'password', { transaction });

      await queryInterface.removeColumn('users', 'disabled', { transaction });

      await queryInterface.removeColumn('users', 'email', { transaction });

      await queryInterface.removeColumn('users', 'phoneNumber', {
        transaction,
      });

      await queryInterface.removeColumn('users', 'lastName', { transaction });

      await queryInterface.removeColumn('users', 'firstName', { transaction });

      await queryInterface.dropTable('permissions', { transaction });

      await queryInterface.dropTable('roles', { transaction });

      await queryInterface.dropTable('agents', { transaction });

      await queryInterface.dropTable('users', { transaction });

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
};
72
backend/src/db/migrations/1746173616791.js
Normal file
@@ -0,0 +1,72 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.createTable(
|
||||
'conversations',
|
||||
{
|
||||
id: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
defaultValue: Sequelize.DataTypes.UUIDV4,
|
||||
primaryKey: true,
|
||||
},
|
||||
createdById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
updatedById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
createdAt: { type: Sequelize.DataTypes.DATE },
|
||||
updatedAt: { type: Sequelize.DataTypes.DATE },
|
||||
deletedAt: { type: Sequelize.DataTypes.DATE },
|
||||
importHash: {
|
||||
type: Sequelize.DataTypes.STRING(255),
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.dropTable('conversations', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
49
backend/src/db/migrations/1746173666908.js
Normal file
@@ -0,0 +1,49 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'conversations',
|
||||
'title',
|
||||
{
|
||||
type: Sequelize.DataTypes.TEXT,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('conversations', 'title', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
54
backend/src/db/migrations/1746173712387.js
Normal file
@@ -0,0 +1,54 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'conversations',
|
||||
'userId',
|
||||
{
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
|
||||
references: {
|
||||
model: 'users',
|
||||
key: 'id',
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('conversations', 'userId', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
49
backend/src/db/migrations/1746173747920.js
Normal file
@@ -0,0 +1,49 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'conversations',
|
||||
'createdat',
|
||||
{
|
||||
type: Sequelize.DataTypes.DATE,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('conversations', 'createdat', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
49
backend/src/db/migrations/1746173773482.js
Normal file
@@ -0,0 +1,49 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'conversations',
|
||||
'updatedat',
|
||||
{
|
||||
type: Sequelize.DataTypes.DATE,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('conversations', 'updatedat', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
72
backend/src/db/migrations/1746173790909.js
Normal file
@@ -0,0 +1,72 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.createTable(
|
||||
'messages',
|
||||
{
|
||||
id: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
defaultValue: Sequelize.DataTypes.UUIDV4,
|
||||
primaryKey: true,
|
||||
},
|
||||
createdById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
updatedById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
createdAt: { type: Sequelize.DataTypes.DATE },
|
||||
updatedAt: { type: Sequelize.DataTypes.DATE },
|
||||
deletedAt: { type: Sequelize.DataTypes.DATE },
|
||||
importHash: {
|
||||
type: Sequelize.DataTypes.STRING(255),
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.dropTable('messages', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
54
backend/src/db/migrations/1746173812711.js
Normal file
@@ -0,0 +1,54 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'messages',
|
||||
'conversationId',
|
||||
{
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
|
||||
references: {
|
||||
model: 'conversations',
|
||||
key: 'id',
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('messages', 'conversationId', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
47
backend/src/db/migrations/1746173835894.js
Normal file
@@ -0,0 +1,47 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'messages',
|
||||
'content',
|
||||
{
|
||||
type: Sequelize.DataTypes.TEXT,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('messages', 'content', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
47
backend/src/db/migrations/1746173858091.js
Normal file
@@ -0,0 +1,47 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'messages',
|
||||
'sender',
|
||||
{
|
||||
type: Sequelize.DataTypes.TEXT,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('messages', 'sender', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
49
backend/src/db/migrations/1746173878197.js
Normal file
@@ -0,0 +1,49 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'messages',
|
||||
'agentname',
|
||||
{
|
||||
type: Sequelize.DataTypes.TEXT,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('messages', 'agentname', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
49
backend/src/db/migrations/1746173895643.js
Normal file
@@ -0,0 +1,49 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'messages',
|
||||
'createdat',
|
||||
{
|
||||
type: Sequelize.DataTypes.DATE,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('messages', 'createdat', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
72
backend/src/db/migrations/1746184984998.js
Normal file
@@ -0,0 +1,72 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.createTable(
|
||||
'accounts',
|
||||
{
|
||||
id: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
defaultValue: Sequelize.DataTypes.UUIDV4,
|
||||
primaryKey: true,
|
||||
},
|
||||
createdById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
updatedById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
createdAt: { type: Sequelize.DataTypes.DATE },
|
||||
updatedAt: { type: Sequelize.DataTypes.DATE },
|
||||
deletedAt: { type: Sequelize.DataTypes.DATE },
|
||||
importHash: {
|
||||
type: Sequelize.DataTypes.STRING(255),
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.dropTable('accounts', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
72
backend/src/db/migrations/1746185038547.js
Normal file
@@ -0,0 +1,72 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.createTable(
|
||||
'contacts',
|
||||
{
|
||||
id: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
defaultValue: Sequelize.DataTypes.UUIDV4,
|
||||
primaryKey: true,
|
||||
},
|
||||
createdById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
updatedById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
createdAt: { type: Sequelize.DataTypes.DATE },
|
||||
updatedAt: { type: Sequelize.DataTypes.DATE },
|
||||
deletedAt: { type: Sequelize.DataTypes.DATE },
|
||||
importHash: {
|
||||
type: Sequelize.DataTypes.STRING(255),
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.dropTable('contacts', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
72
backend/src/db/migrations/1746185071635.js
Normal file
@@ -0,0 +1,72 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.createTable(
|
||||
'contact_lists',
|
||||
{
|
||||
id: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
defaultValue: Sequelize.DataTypes.UUIDV4,
|
||||
primaryKey: true,
|
||||
},
|
||||
createdById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
updatedById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
createdAt: { type: Sequelize.DataTypes.DATE },
|
||||
updatedAt: { type: Sequelize.DataTypes.DATE },
|
||||
deletedAt: { type: Sequelize.DataTypes.DATE },
|
||||
importHash: {
|
||||
type: Sequelize.DataTypes.STRING(255),
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.dropTable('contact_lists', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
47
backend/src/db/migrations/1746185110536.js
Normal file
@@ -0,0 +1,47 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'accounts',
|
||||
'name',
|
||||
{
|
||||
type: Sequelize.DataTypes.TEXT,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('accounts', 'name', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
52
backend/src/db/migrations/1746185137721.js
Normal file
@@ -0,0 +1,52 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'accounts',
|
||||
'userId',
|
||||
{
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
|
||||
references: {
|
||||
model: 'users',
|
||||
key: 'id',
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('accounts', 'userId', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
54
backend/src/db/migrations/1746185173281.js
Normal file
@@ -0,0 +1,54 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'contacts',
|
||||
'accountId',
|
||||
{
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
|
||||
references: {
|
||||
model: 'accounts',
|
||||
key: 'id',
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('contacts', 'accountId', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
72
backend/src/db/migrations/1746185211887.js
Normal file
@@ -0,0 +1,72 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.createTable(
|
||||
'tags',
|
||||
{
|
||||
id: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
defaultValue: Sequelize.DataTypes.UUIDV4,
|
||||
primaryKey: true,
|
||||
},
|
||||
createdById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
updatedById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
createdAt: { type: Sequelize.DataTypes.DATE },
|
||||
updatedAt: { type: Sequelize.DataTypes.DATE },
|
||||
deletedAt: { type: Sequelize.DataTypes.DATE },
|
||||
importHash: {
|
||||
type: Sequelize.DataTypes.STRING(255),
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.dropTable('tags', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
47
backend/src/db/migrations/1746185257237.js
Normal file
@@ -0,0 +1,47 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'tags',
|
||||
'name',
|
||||
{
|
||||
type: Sequelize.DataTypes.TEXT,
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('tags', 'name', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
72
backend/src/db/migrations/1746185290773.js
Normal file
@@ -0,0 +1,72 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.createTable(
|
||||
'contact_tags',
|
||||
{
|
||||
id: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
defaultValue: Sequelize.DataTypes.UUIDV4,
|
||||
primaryKey: true,
|
||||
},
|
||||
createdById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
updatedById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
createdAt: { type: Sequelize.DataTypes.DATE },
|
||||
updatedAt: { type: Sequelize.DataTypes.DATE },
|
||||
deletedAt: { type: Sequelize.DataTypes.DATE },
|
||||
importHash: {
|
||||
type: Sequelize.DataTypes.STRING(255),
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.dropTable('contact_tags', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
54
backend/src/db/migrations/1746185350189.js
Normal file
@@ -0,0 +1,54 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'contact_tags',
|
||||
'contactId',
|
||||
{
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
|
||||
references: {
|
||||
model: 'contacts',
|
||||
key: 'id',
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('contact_tags', 'contactId', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
54
backend/src/db/migrations/1746185379257.js
Normal file
@@ -0,0 +1,54 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'contact_tags',
|
||||
'tagId',
|
||||
{
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
|
||||
references: {
|
||||
model: 'tags',
|
||||
key: 'id',
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('contact_tags', 'tagId', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
54
backend/src/db/migrations/1746185434891.js
Normal file
@@ -0,0 +1,54 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.addColumn(
|
||||
'contact_lists',
|
||||
'accountId',
|
||||
{
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
|
||||
references: {
|
||||
model: 'accounts',
|
||||
key: 'id',
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.removeColumn('contact_lists', 'accountId', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
74
backend/src/db/migrations/1746185475329.js
Normal file
@@ -0,0 +1,74 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.createTable(
|
||||
'contact_list_membership',
|
||||
{
|
||||
id: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
defaultValue: Sequelize.DataTypes.UUIDV4,
|
||||
primaryKey: true,
|
||||
},
|
||||
createdById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
updatedById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
createdAt: { type: Sequelize.DataTypes.DATE },
|
||||
updatedAt: { type: Sequelize.DataTypes.DATE },
|
||||
deletedAt: { type: Sequelize.DataTypes.DATE },
|
||||
importHash: {
|
||||
type: Sequelize.DataTypes.STRING(255),
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.dropTable('contact_list_membership', {
|
||||
transaction,
|
||||
});
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
72
backend/src/db/migrations/1746185539174.js
Normal file
@@ -0,0 +1,72 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.createTable(
|
||||
'sequences',
|
||||
{
|
||||
id: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
defaultValue: Sequelize.DataTypes.UUIDV4,
|
||||
primaryKey: true,
|
||||
},
|
||||
createdById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
updatedById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
createdAt: { type: Sequelize.DataTypes.DATE },
|
||||
updatedAt: { type: Sequelize.DataTypes.DATE },
|
||||
deletedAt: { type: Sequelize.DataTypes.DATE },
|
||||
importHash: {
|
||||
type: Sequelize.DataTypes.STRING(255),
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.dropTable('sequences', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
72
backend/src/db/migrations/1746185568402.js
Normal file
@@ -0,0 +1,72 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.createTable(
|
||||
'sequence_steps',
|
||||
{
|
||||
id: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
defaultValue: Sequelize.DataTypes.UUIDV4,
|
||||
primaryKey: true,
|
||||
},
|
||||
createdById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
updatedById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
createdAt: { type: Sequelize.DataTypes.DATE },
|
||||
updatedAt: { type: Sequelize.DataTypes.DATE },
|
||||
deletedAt: { type: Sequelize.DataTypes.DATE },
|
||||
importHash: {
|
||||
type: Sequelize.DataTypes.STRING(255),
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.dropTable('sequence_steps', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
72
backend/src/db/migrations/1746185601794.js
Normal file
@@ -0,0 +1,72 @@
|
||||
module.exports = {
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async up(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.createTable(
|
||||
'sequence_assignments',
|
||||
{
|
||||
id: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
defaultValue: Sequelize.DataTypes.UUIDV4,
|
||||
primaryKey: true,
|
||||
},
|
||||
createdById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
updatedById: {
|
||||
type: Sequelize.DataTypes.UUID,
|
||||
references: {
|
||||
key: 'id',
|
||||
model: 'users',
|
||||
},
|
||||
},
|
||||
createdAt: { type: Sequelize.DataTypes.DATE },
|
||||
updatedAt: { type: Sequelize.DataTypes.DATE },
|
||||
deletedAt: { type: Sequelize.DataTypes.DATE },
|
||||
importHash: {
|
||||
type: Sequelize.DataTypes.STRING(255),
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
},
|
||||
},
|
||||
{ transaction },
|
||||
);
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
/**
|
||||
* @param {QueryInterface} queryInterface
|
||||
* @param {Sequelize} Sequelize
|
||||
* @returns {Promise<void>}
|
||||
*/
|
||||
async down(queryInterface, Sequelize) {
|
||||
/**
|
||||
* @type {Transaction}
|
||||
*/
|
||||
const transaction = await queryInterface.sequelize.transaction();
|
||||
try {
|
||||
await queryInterface.dropTable('sequence_assignments', { transaction });
|
||||
|
||||
await transaction.commit();
|
||||
} catch (err) {
|
||||
await transaction.rollback();
|
||||
throw err;
|
||||
}
|
||||
},
|
||||
};
|
||||
backend/src/db/migrations/1746185648366.js (new file, 74 lines)

module.exports = {
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async up(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.createTable(
        'contact_sequence_status',
        {
          id: {
            type: Sequelize.DataTypes.UUID,
            defaultValue: Sequelize.DataTypes.UUIDV4,
            primaryKey: true,
          },
          createdById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          updatedById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          createdAt: { type: Sequelize.DataTypes.DATE },
          updatedAt: { type: Sequelize.DataTypes.DATE },
          deletedAt: { type: Sequelize.DataTypes.DATE },
          importHash: {
            type: Sequelize.DataTypes.STRING(255),
            allowNull: true,
            unique: true,
          },
        },
        { transaction },
      );

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async down(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.dropTable('contact_sequence_status', {
        transaction,
      });

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
};
backend/src/db/migrations/1746185688485.js (new file, 72 lines)

module.exports = {
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async up(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.createTable(
        'sent_emails',
        {
          id: {
            type: Sequelize.DataTypes.UUID,
            defaultValue: Sequelize.DataTypes.UUIDV4,
            primaryKey: true,
          },
          createdById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          updatedById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          createdAt: { type: Sequelize.DataTypes.DATE },
          updatedAt: { type: Sequelize.DataTypes.DATE },
          deletedAt: { type: Sequelize.DataTypes.DATE },
          importHash: {
            type: Sequelize.DataTypes.STRING(255),
            allowNull: true,
            unique: true,
          },
        },
        { transaction },
      );

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async down(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.dropTable('sent_emails', { transaction });

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
};
backend/src/db/migrations/1746185725000.js (new file, 72 lines)

module.exports = {
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async up(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.createTable(
        'auto_reply_rules',
        {
          id: {
            type: Sequelize.DataTypes.UUID,
            defaultValue: Sequelize.DataTypes.UUIDV4,
            primaryKey: true,
          },
          createdById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          updatedById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          createdAt: { type: Sequelize.DataTypes.DATE },
          updatedAt: { type: Sequelize.DataTypes.DATE },
          deletedAt: { type: Sequelize.DataTypes.DATE },
          importHash: {
            type: Sequelize.DataTypes.STRING(255),
            allowNull: true,
            unique: true,
          },
        },
        { transaction },
      );

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async down(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.dropTable('auto_reply_rules', { transaction });

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
};
backend/src/db/migrations/1746185754182.js (new file, 72 lines)

module.exports = {
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async up(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.createTable(
        'secure_gmail_tokens',
        {
          id: {
            type: Sequelize.DataTypes.UUID,
            defaultValue: Sequelize.DataTypes.UUIDV4,
            primaryKey: true,
          },
          createdById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          updatedById: {
            type: Sequelize.DataTypes.UUID,
            references: {
              key: 'id',
              model: 'users',
            },
          },
          createdAt: { type: Sequelize.DataTypes.DATE },
          updatedAt: { type: Sequelize.DataTypes.DATE },
          deletedAt: { type: Sequelize.DataTypes.DATE },
          importHash: {
            type: Sequelize.DataTypes.STRING(255),
            allowNull: true,
            unique: true,
          },
        },
        { transaction },
      );

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async down(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.dropTable('secure_gmail_tokens', { transaction });

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
};
backend/src/db/migrations/1746185784948.js (new file, 54 lines)

module.exports = {
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async up(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.addColumn(
        'secure_gmail_tokens',
        'accountId',
        {
          type: Sequelize.DataTypes.UUID,

          references: {
            model: 'accounts',
            key: 'id',
          },
        },
        { transaction },
      );

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async down(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.removeColumn('secure_gmail_tokens', 'accountId', {
        transaction,
      });

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
};
backend/src/db/migrations/1746185832219.js (new file, 47 lines)

module.exports = {
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async up(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.addColumn(
        'contacts',
        'email',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async down(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.removeColumn('contacts', 'email', { transaction });

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
};
backend/src/db/migrations/1746185880620.js (new file, 49 lines)

module.exports = {
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async up(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.addColumn(
        'contacts',
        'first_name',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async down(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.removeColumn('contacts', 'first_name', {
        transaction,
      });

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
};
backend/src/db/migrations/1746185913392.js (new file, 49 lines)

module.exports = {
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async up(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.addColumn(
        'contacts',
        'last_name',
        {
          type: Sequelize.DataTypes.TEXT,
        },
        { transaction },
      );

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
  /**
   * @param {QueryInterface} queryInterface
   * @param {Sequelize} Sequelize
   * @returns {Promise<void>}
   */
  async down(queryInterface, Sequelize) {
    /**
     * @type {Transaction}
     */
    const transaction = await queryInterface.sequelize.transaction();
    try {
      await queryInterface.removeColumn('contacts', 'last_name', {
        transaction,
      });

      await transaction.commit();
    } catch (err) {
      await transaction.rollback();
      throw err;
    }
  },
};
backend/src/db/models/accounts.js (new file, 81 lines)

const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');

module.exports = function (sequelize, DataTypes) {
  const accounts = sequelize.define(
    'accounts',
    {
      id: {
        type: DataTypes.UUID,
        defaultValue: DataTypes.UUIDV4,
        primaryKey: true,
      },

      name: {
        type: DataTypes.TEXT,
      },

      importHash: {
        type: DataTypes.STRING(255),
        allowNull: true,
        unique: true,
      },
    },
    {
      timestamps: true,
      paranoid: true,
      freezeTableName: true,
    },
  );

  accounts.associate = (db) => {
    /// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity

    db.accounts.hasMany(db.contacts, {
      as: 'contacts_account',
      foreignKey: {
        name: 'accountId',
      },
      constraints: false,
    });

    db.accounts.hasMany(db.contact_lists, {
      as: 'contact_lists_account',
      foreignKey: {
        name: 'accountId',
      },
      constraints: false,
    });

    db.accounts.hasMany(db.secure_gmail_tokens, {
      as: 'secure_gmail_tokens_account',
      foreignKey: {
        name: 'accountId',
      },
      constraints: false,
    });

    //end loop

    db.accounts.belongsTo(db.users, {
      as: 'user',
      foreignKey: {
        name: 'userId',
      },
      constraints: false,
    });

    db.accounts.belongsTo(db.users, {
      as: 'createdBy',
    });

    db.accounts.belongsTo(db.users, {
      as: 'updatedBy',
    });
  };

  return accounts;
};
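The `as:` aliases declared in `accounts.associate` are what query code would reference when eager-loading related rows. A hypothetical usage sketch follows; the models index path is an assumption and does not appear in this diff.

```javascript
const db = require('./backend/src/db/models'); // assumed models index (not shown in this diff)

// Fetch accounts together with their contacts and Gmail tokens, using the
// aliases defined in accounts.associate above.
async function listAccountsWithRelations() {
  return db.accounts.findAll({
    include: [
      { model: db.contacts, as: 'contacts_account' },
      { model: db.secure_gmail_tokens, as: 'secure_gmail_tokens_account' },
    ],
  });
}
```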
backend/src/db/models/agents.js (new file, 63 lines)

const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');

module.exports = function (sequelize, DataTypes) {
  const agents = sequelize.define(
    'agents',
    {
      id: {
        type: DataTypes.UUID,
        defaultValue: DataTypes.UUIDV4,
        primaryKey: true,
      },

      name: {
        type: DataTypes.TEXT,
      },

      expertise: {
        type: DataTypes.TEXT,
      },

      purpose: {
        type: DataTypes.TEXT,
      },

      status: {
        type: DataTypes.ENUM,
        values: ['active', 'inactive', 'development'],
      },

      importHash: {
        type: DataTypes.STRING(255),
        allowNull: true,
        unique: true,
      },
    },
    {
      timestamps: true,
      paranoid: true,
      freezeTableName: true,
    },
  );

  agents.associate = (db) => {
    /// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity

    //end loop

    db.agents.belongsTo(db.users, {
      as: 'createdBy',
    });

    db.agents.belongsTo(db.users, {
      as: 'updatedBy',
    });
  };

  return agents;
};
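Because `status` is declared as an ENUM, only the three listed values are accepted at the model level. A hypothetical sketch of creating a row against this model; the field values are placeholders and the models index path is assumed.

```javascript
const db = require('./backend/src/db/models'); // assumed models index

async function createDraftAgent() {
  // 'status' must be one of 'active', 'inactive', 'development'.
  return db.agents.create({
    name: 'Outreach assistant',          // placeholder
    expertise: 'Cold email follow-ups',  // placeholder
    purpose: 'Drafting sequence replies',// placeholder
    status: 'development',
  });
}
```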
backend/src/db/models/auto_reply_rules.js (new file, 45 lines)

const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');

module.exports = function (sequelize, DataTypes) {
  const auto_reply_rules = sequelize.define(
    'auto_reply_rules',
    {
      id: {
        type: DataTypes.UUID,
        defaultValue: DataTypes.UUIDV4,
        primaryKey: true,
      },

      importHash: {
        type: DataTypes.STRING(255),
        allowNull: true,
        unique: true,
      },
    },
    {
      timestamps: true,
      paranoid: true,
      freezeTableName: true,
    },
  );

  auto_reply_rules.associate = (db) => {
    /// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity

    //end loop

    db.auto_reply_rules.belongsTo(db.users, {
      as: 'createdBy',
    });

    db.auto_reply_rules.belongsTo(db.users, {
      as: 'updatedBy',
    });
  };

  return auto_reply_rules;
};
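All of these models are defined with `paranoid: true`, which is why the migrations above create a `deletedAt` column: `destroy()` soft-deletes by setting `deletedAt`, and soft-deleted rows are skipped by default queries unless `paranoid: false` is passed. A brief sketch, with the models index path assumed:

```javascript
const db = require('./backend/src/db/models'); // assumed models index

async function softDeleteDemo(ruleId) {
  // Sets deletedAt instead of removing the row.
  await db.auto_reply_rules.destroy({ where: { id: ruleId } });

  const visible = await db.auto_reply_rules.findAll();                    // excludes soft-deleted rows
  const all = await db.auto_reply_rules.findAll({ paranoid: false });     // includes them

  return { visible, all };
}
```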
backend/src/db/models/contact_list_membership.js (new file, 45 lines)

const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');

module.exports = function (sequelize, DataTypes) {
  const contact_list_membership = sequelize.define(
    'contact_list_membership',
    {
      id: {
        type: DataTypes.UUID,
        defaultValue: DataTypes.UUIDV4,
        primaryKey: true,
      },

      importHash: {
        type: DataTypes.STRING(255),
        allowNull: true,
        unique: true,
      },
    },
    {
      timestamps: true,
      paranoid: true,
      freezeTableName: true,
    },
  );

  contact_list_membership.associate = (db) => {
    /// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity

    //end loop

    db.contact_list_membership.belongsTo(db.users, {
      as: 'createdBy',
    });

    db.contact_list_membership.belongsTo(db.users, {
      as: 'updatedBy',
    });
  };

  return contact_list_membership;
};
backend/src/db/models/contact_lists.js (new file, 53 lines)

const config = require('../../config');
const providers = config.providers;
const crypto = require('crypto');
const bcrypt = require('bcrypt');
const moment = require('moment');

module.exports = function (sequelize, DataTypes) {
  const contact_lists = sequelize.define(
    'contact_lists',
    {
      id: {
        type: DataTypes.UUID,
        defaultValue: DataTypes.UUIDV4,
        primaryKey: true,
      },

      importHash: {
        type: DataTypes.STRING(255),
        allowNull: true,
        unique: true,
      },
    },
    {
      timestamps: true,
      paranoid: true,
      freezeTableName: true,
    },
  );

  contact_lists.associate = (db) => {
    /// loop through entities and their fields; if ref === current e[name], create a hasMany relation on the parent entity

    //end loop

    db.contact_lists.belongsTo(db.accounts, {
      as: 'account',
      foreignKey: {
        name: 'accountId',
      },
      constraints: false,
    });

    db.contact_lists.belongsTo(db.users, {
      as: 'createdBy',
    });

    db.contact_lists.belongsTo(db.users, {
      as: 'updatedBy',
    });
  };

  return contact_lists;
};