Compare commits


42 Commits

Author SHA1 Message Date
DerLinkman
f09c8534f5 Merge branch 'master' into feature/fts-xapian 2023-03-15 16:13:51 +01:00
DerLinkman
0408bbf57f Merge branch 'feature/bootstrap5' into feature/fts-xapian 2022-11-23 12:00:15 +01:00
DerLinkman
eb4a33dc27 Implemented current state of UI (BS5) 2022-10-19 13:41:33 +02:00
DerLinkman
a0ae59f8bf Added 1.20-Xapian Image 2022-10-07 17:19:15 +02:00
DerLinkman
60871fa7d0 Merge remote-tracking branch 'origin/staging' into feature/fts-xapian 2022-10-07 17:06:12 +02:00
DerLinkman
59a4031e24 Merge branch 'staging' into feature/fts-xapian 2022-10-07 17:03:28 +02:00
DerLinkman
04f55fc748 Updated newer Component Versions 2022-10-07 17:01:32 +02:00
DerLinkman
3ba3f1c2bd Updated Dovecot Configs to be up to date 2022-09-30 20:37:48 +02:00
DerLinkman
b0fd9787b5 Enabled Substring Search for flatcurve 2022-07-20 15:49:43 +02:00
DerLinkman
df3de09050 Multi Build Dovecot Building + Config Optimizations 2022-07-18 18:35:22 +02:00
DerLinkman
86079429b3 Using Stable Dovecot Builds for flatcurve 2022-07-18 14:24:46 +02:00
DerLinkman
ed476aae6b Removed Solr Leftovers + renamed flatcurve conf 2022-07-18 11:15:40 +02:00
moo
f0e27312f9 Restored manual Dovecot Build 2022-05-12 11:11:13 +02:00
moo
3425bcfbf0 Renamed, reconfigured needed Xapian files for Flatcurve 2022-04-29 14:35:54 +02:00
moo
bfa81b318d Changed Build process to fts-flatcurve Version of Xapian FTS 2022-04-29 14:28:49 +02:00
Niklas Meyer
8dba0ca7dd Merge branch 'staging' into feature/fts-xapian 2022-03-02 17:07:18 +01:00
Niklas Meyer
5bd3394ed9 Delete decode2text.sh 2022-02-23 16:12:15 +01:00
Niklas Meyer
c0e66254b9 Cleanup Dockerfile + Added new dependencies 2022-02-23 09:45:28 +01:00
Niklas Meyer
aec2dd1252 Merge pull request #4455 from DerLinkman/Dovecot-FTS-XAPIAN
[XAPIAN] Added Solr Replacement [BETA]
2022-02-07 08:39:17 +01:00
Niklas Meyer
d86e9a22f4 Fetch Staging from orig Repo (#3)
* [Web] add github version tag

* [Web] add github version tag

* [Web] add github version tag

* [Web] add github version tag

* [Web] add github version tag

* [Web] add github version tag error handling

* [Web] add github version tag error handling

* Passwordless SOGo auth: support for calendar invitations and calendar/contacts subscriptions

Inviting someone to a calendar event triggers a request to /SOGo/so/otheruser@example.com/freebusy.ifb/ajaxRead. Subscribing to someone's calendar/contacts triggers a request to /SOGo/so/otheruser@example.com/foldersSearch. The email address in the URL is different from the logged-in user, which needs to be handled appropriately by sogo-auth.php.

* [Web] add github version tag - adjust css

* [Compose] Update SOGo Autoreply Schedule to 5m

Based on the advice of inverse (SOGo developer). Thanks to https://github.com/jmber

Closes: https://github.com/mailcow/mailcow-dockerized/issues/4436

* [Web] add github version tag - move twig globals

* [Web] add github version tag - missing </div>

* Passwordless SOGo auth: improvements for when accessing other users

* [WebAuthn] fido2 passwordless auth - fix (#4440)

* [WebAuthn] fido2 revert

* [WebAuthn] set UV flags to 'discouraged'

* [WebAuthn] revert - set UV flags to 'discouraged'

* Update clamav to 0.104.2

* Update clamav to 0.104.2

* Update dovecot to 2.3.18

Update gosu to 1.14
Use debian bullseye as base

* [Web] Updated lang.es.json [CI SKIP] (#4453)

Co-authored-by: Fijxu <fijxu@zzls.xyz>
Co-authored-by: milkmaker <milkmaker@mailcow.de>

Co-authored-by: Fijxu <fijxu@zzls.xyz>

Co-authored-by: FreddleSpl0it <patschul@posteo.de>
Co-authored-by: FreddleSpl0it <75116288+FreddleSpl0it@users.noreply.github.com>
Co-authored-by: Michael Kuron <mkuron@users.noreply.github.com>
Co-authored-by: Peter <magic@kthx.at>
Co-authored-by: milkmaker <milkmaker@mailcow.de>
Co-authored-by: Fijxu <fijxu@zzls.xyz>
2022-02-07 08:28:10 +01:00
Niklas Meyer
db73f83c4e Merge pull request #2 from mailcow/master
Jan(moo)uary Update 2022 - Revision A (2022-01a) (#4445)
2022-02-07 08:21:31 +01:00
Niklas Meyer
28582c5842 [Compose] Changed Ofelia CMD for fts optimize 2022-02-01 14:50:19 +01:00
Niklas Meyer
3d637aca25 [Dovecot] Added Xapian Parameters from mailcow.conf 2022-02-01 14:48:46 +01:00
Niklas Meyer
d079ff49c6 Update syslog-ng.conf 2022-01-28 17:21:20 +01:00
Niklas Meyer
2d6ce926e1 Update syslog-ng-redis_slave.conf 2022-01-28 17:21:07 +01:00
Niklas Meyer
60ddfe3be2 [Dovecot] Added Xapian include 2022-01-28 17:07:12 +01:00
Niklas Meyer
30e2d944cd [Dovecot] Removed Xapian from dovecot.conf
This adds an include_try directive pointing to the file instead.
2022-01-28 16:51:33 +01:00
Niklas Meyer
99ea569288 [Dovecot] Added separate XAPIAN Conf 2022-01-28 16:49:58 +01:00
Niklas Meyer
c98ef0d0c5 Delete FTS-Xapian.conf 2022-01-28 16:49:30 +01:00
Niklas Meyer
f09ca0a36a [Dovecot] Added separate XAPIAN Conf 2022-01-28 16:48:51 +01:00
Niklas Meyer
cdce97bd59 [Dovecot] Changed Xapian default to 1024m instead of 2G 2022-01-28 15:03:23 +01:00
Niklas Meyer
ed8941440a [Dovecot] Added Xapian default config 2022-01-28 12:27:57 +01:00
Niklas Meyer
570170a5b1 [Compose] Remove solr from ipv6-nat dependencies 2022-01-28 12:24:51 +01:00
Niklas Meyer
df2c33d323 [Compose] Replace solr to Xapian (in dovecot)
First revision. Waiting on: https://github.com/grosjo/fts-xapian/issues/115
2022-01-28 11:19:38 +01:00
Niklas Meyer
f2e0e50f87 [Config] Readded default Value for Xapian Heap 2022-01-28 11:01:31 +01:00
Niklas Meyer
26c5ed73e2 [Config] Replace Solr with Xapian (Remove Solr Binds) 2022-01-28 10:45:33 +01:00
Niklas Meyer
148b511f9d [Update.sh] Replace Solr with Xapian 2022-01-28 10:44:08 +01:00
Niklas Meyer
311007700b [Dovecot] Add decode2text.sh 2022-01-27 15:36:03 +01:00
Niklas Meyer
3a9177bd4c Merge branch 'mailcow:master' into Dovecot-FTS-XAPIAN 2022-01-27 08:24:43 +01:00
Niklas Meyer
bca09e3afa [Dovecot] Rebase on Bullseye + Xapian Compile
This push adds the Bullseye rebase plus compilation of the Xapian core and plugin to run with Dovecot 2.3.17
2022-01-27 08:24:28 +01:00
Peter
cfba96f7e0 [GH-Actions][stale] Add neverstale label to exempt list 2022-01-22 17:41:46 +01:00
ntimo
c82f38a025 [API] Fix minor issue in api docs 2022-01-21 21:29:16 +00:00
76 changed files with 1060 additions and 1614 deletions

View File

@@ -14,7 +14,7 @@ jobs:
pull-requests: write
steps:
- name: Mark/Close Stale Issues and Pull Requests 🗑️
uses: actions/stale@v8.0.0
uses: actions/stale@v7.0.0
with:
repo-token: ${{ secrets.STALE_ACTION_PAT }}
days-before-stale: 60

View File

@@ -1,39 +0,0 @@
name: Update postscreen_access.cidr
on:
schedule:
# Monthly
- cron: "0 0 1 * *"
workflow_dispatch: # Allow to run workflow manually
permissions:
contents: read # to fetch code (actions/checkout)
jobs:
Update-postscreen_access_cidr:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Generate postscreen_access.cidr
run: |
bash helper-scripts/update_postscreen_whitelist.sh
- name: Create Pull Request
uses: peter-evans/create-pull-request@v5
with:
token: ${{ secrets.mailcow_action_Update_postscreen_access_cidr_pat }}
commit-message: update postscreen_access.cidr
committer: milkmaker <milkmaker@mailcow.de>
author: milkmaker <milkmaker@mailcow.de>
signoff: false
branch: update/postscreen_access.cidr
base: staging
delete-branch: true
add-paths: |
data/conf/postfix/postscreen_access.cidr
title: '[Postfix] update postscreen_access.cidr'
body: |
This PR updates the postscreen_access.cidr using GitHub Actions and [helper-scripts/update_postscreen_whitelist.sh](https://github.com/mailcow/mailcow-dockerized/blob/master/helper-scripts/update_postscreen_whitelist.sh)

.gitignore (vendored): 1 line changed
View File

@@ -13,6 +13,7 @@ data/conf/dovecot/acl_anyone
data/conf/dovecot/dovecot-master.passwd
data/conf/dovecot/dovecot-master.userdb
data/conf/dovecot/extra.conf
data/conf/dovecot/dovecot-fts-flatcurve.conf
data/conf/dovecot/global_sieve_*
data/conf/dovecot/last_login
data/conf/dovecot/lua

View File

@@ -1,6 +1,6 @@
FROM alpine:3.17
LABEL maintainer "The Infrastructure Company GmbH <info@servercow.de>"
LABEL maintainer "Andre Peters <andre.peters@servercow.de>"
RUN apk upgrade --no-cache \
&& apk add --update --no-cache \

View File

@@ -1,6 +1,6 @@
FROM alpine:3.17
LABEL maintainer "The Infrastructure Company GmbH <info@servercow.de>"
LABEL maintainer "Andre Peters <andre.peters@servercow.de>"
WORKDIR /app
@@ -14,12 +14,9 @@ RUN apk add --update --no-cache python3 \
uvicorn \
aiodocker \
docker \
aioredis
RUN mkdir /app/modules
redis
COPY docker-entrypoint.sh /app/
COPY main.py /app/main.py
COPY modules/ /app/modules/
COPY dockerapi.py /app/
ENTRYPOINT ["/bin/sh", "/app/docker-entrypoint.sh"]
CMD exec python main.py

View File

@@ -6,4 +6,4 @@
-subj /CN=dockerapi/O=mailcow \
-addext subjectAltName=DNS:dockerapi`
exec "$@"
`uvicorn --host 0.0.0.0 --port 443 --ssl-certfile=/app/dockerapi_cert.pem --ssl-keyfile=/app/dockerapi_key.pem dockerapi:app`

View File

@@ -0,0 +1,539 @@
from fastapi import FastAPI, Response, Request
import aiodocker
import docker
import psutil
import sys
import traceback
import re
import time
import os
import json
import asyncio
import redis
from datetime import datetime
import logging
from logging.config import dictConfig
log_config = {
"version": 1,
"disable_existing_loggers": False,
"formatters": {
"default": {
"()": "uvicorn.logging.DefaultFormatter",
"fmt": "%(levelprefix)s %(asctime)s %(message)s",
"datefmt": "%Y-%m-%d %H:%M:%S",
},
},
"handlers": {
"default": {
"formatter": "default",
"class": "logging.StreamHandler",
"stream": "ext://sys.stderr",
},
},
"loggers": {
"api-logger": {"handlers": ["default"], "level": "INFO"},
},
}
dictConfig(log_config)
containerIds_to_update = []
host_stats_isUpdating = False
app = FastAPI()
logger = logging.getLogger('api-logger')
@app.get("/host/stats")
async def get_host_update_stats():
global host_stats_isUpdating
if host_stats_isUpdating == False:
asyncio.create_task(get_host_stats())
host_stats_isUpdating = True
while True:
if redis_client.exists('host_stats'):
break
await asyncio.sleep(1.5)
stats = json.loads(redis_client.get('host_stats'))
return Response(content=json.dumps(stats, indent=4), media_type="application/json")
@app.get("/containers/{container_id}/json")
async def get_container(container_id : str):
if container_id and container_id.isalnum():
try:
for container in (await async_docker_client.containers.list()):
if container._id == container_id:
container_info = await container.show()
return Response(content=json.dumps(container_info, indent=4), media_type="application/json")
res = {
"type": "danger",
"msg": "no container found"
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
except Exception as e:
res = {
"type": "danger",
"msg": str(e)
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = {
"type": "danger",
"msg": "no or invalid id defined"
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
@app.get("/containers/json")
async def get_containers():
containers = {}
try:
for container in (await async_docker_client.containers.list()):
container_info = await container.show()
containers.update({container_info['Id']: container_info})
return Response(content=json.dumps(containers, indent=4), media_type="application/json")
except Exception as e:
res = {
"type": "danger",
"msg": str(e)
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
@app.post("/containers/{container_id}/{post_action}")
async def post_containers(container_id : str, post_action : str, request: Request):
try :
request_json = await request.json()
except Exception as err:
request_json = {}
if container_id and container_id.isalnum() and post_action:
try:
"""Dispatch container_post api call"""
if post_action == 'exec':
if not request_json or not 'cmd' in request_json:
res = {
"type": "danger",
"msg": "cmd is missing"
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
if not request_json or not 'task' in request_json:
res = {
"type": "danger",
"msg": "task is missing"
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
api_call_method_name = '__'.join(['container_post', str(post_action), str(request_json['cmd']), str(request_json['task']) ])
else:
api_call_method_name = '__'.join(['container_post', str(post_action) ])
docker_utils = DockerUtils(sync_docker_client)
api_call_method = getattr(docker_utils, api_call_method_name, lambda container_id: Response(content=json.dumps({'type': 'danger', 'msg':'container_post - unknown api call' }, indent=4), media_type="application/json"))
logger.info("api call: %s, container_id: %s" % (api_call_method_name, container_id))
return api_call_method(container_id, request_json)
except Exception as e:
logger.error("error - container_post: %s" % str(e))
res = {
"type": "danger",
"msg": str(e)
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = {
"type": "danger",
"msg": "invalid container id or missing action"
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
@app.post("/container/{container_id}/stats/update")
async def post_container_update_stats(container_id : str):
global containerIds_to_update
# start update task for container if no task is running
if container_id not in containerIds_to_update:
asyncio.create_task(get_container_stats(container_id))
containerIds_to_update.append(container_id)
while True:
if redis_client.exists(container_id + '_stats'):
break
await asyncio.sleep(1.5)
stats = json.loads(redis_client.get(container_id + '_stats'))
return Response(content=json.dumps(stats, indent=4), media_type="application/json")
class DockerUtils:
def __init__(self, docker_client):
self.docker_client = docker_client
# api call: container_post - post_action: stop
def container_post__stop(self, container_id, request_json):
for container in self.docker_client.containers.list(all=True, filters={"id": container_id}):
container.stop()
res = { 'type': 'success', 'msg': 'command completed successfully'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: start
def container_post__start(self, container_id, request_json):
for container in self.docker_client.containers.list(all=True, filters={"id": container_id}):
container.start()
res = { 'type': 'success', 'msg': 'command completed successfully'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: restart
def container_post__restart(self, container_id, request_json):
for container in self.docker_client.containers.list(all=True, filters={"id": container_id}):
container.restart()
res = { 'type': 'success', 'msg': 'command completed successfully'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: top
def container_post__top(self, container_id, request_json):
for container in self.docker_client.containers.list(all=True, filters={"id": container_id}):
res = { 'type': 'success', 'msg': container.top()}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: stats
def container_post__stats(self, container_id, request_json):
for container in self.docker_client.containers.list(all=True, filters={"id": container_id}):
for stat in container.stats(decode=True, stream=True):
res = { 'type': 'success', 'msg': stat}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: exec - cmd: mailq - task: delete
def container_post__exec__mailq__delete(self, container_id, request_json):
if 'items' in request_json:
r = re.compile("^[0-9a-fA-F]+$")
filtered_qids = filter(r.match, request_json['items'])
if filtered_qids:
flagged_qids = ['-d %s' % i for i in filtered_qids]
sanitized_string = str(' '.join(flagged_qids));
for container in self.docker_client.containers.list(filters={"id": container_id}):
postsuper_r = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postsuper " + sanitized_string])
return exec_run_handler('generic', postsuper_r)
# api call: container_post - post_action: exec - cmd: mailq - task: hold
def container_post__exec__mailq__hold(self, container_id, request_json):
if 'items' in request_json:
r = re.compile("^[0-9a-fA-F]+$")
filtered_qids = filter(r.match, request_json['items'])
if filtered_qids:
flagged_qids = ['-h %s' % i for i in filtered_qids]
sanitized_string = str(' '.join(flagged_qids));
for container in self.docker_client.containers.list(filters={"id": container_id}):
postsuper_r = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postsuper " + sanitized_string])
return exec_run_handler('generic', postsuper_r)
# api call: container_post - post_action: exec - cmd: mailq - task: cat
def container_post__exec__mailq__cat(self, container_id, request_json):
if 'items' in request_json:
r = re.compile("^[0-9a-fA-F]+$")
filtered_qids = filter(r.match, request_json['items'])
if filtered_qids:
sanitized_string = str(' '.join(filtered_qids));
for container in self.docker_client.containers.list(filters={"id": container_id}):
postcat_return = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postcat -q " + sanitized_string], user='postfix')
if not postcat_return:
postcat_return = 'err: invalid'
return exec_run_handler('utf8_text_only', postcat_return)
# api call: container_post - post_action: exec - cmd: mailq - task: unhold
def container_post__exec__mailq__unhold(self, container_id, request_json):
if 'items' in request_json:
r = re.compile("^[0-9a-fA-F]+$")
filtered_qids = filter(r.match, request_json['items'])
if filtered_qids:
flagged_qids = ['-H %s' % i for i in filtered_qids]
sanitized_string = str(' '.join(flagged_qids));
for container in self.docker_client.containers.list(filters={"id": container_id}):
postsuper_r = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postsuper " + sanitized_string])
return exec_run_handler('generic', postsuper_r)
# api call: container_post - post_action: exec - cmd: mailq - task: deliver
def container_post__exec__mailq__deliver(self, container_id, request_json):
if 'items' in request_json:
r = re.compile("^[0-9a-fA-F]+$")
filtered_qids = filter(r.match, request_json['items'])
if filtered_qids:
flagged_qids = ['-i %s' % i for i in filtered_qids]
for container in self.docker_client.containers.list(filters={"id": container_id}):
for i in flagged_qids:
postqueue_r = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postqueue " + i], user='postfix')
# todo: check each exit code
res = { 'type': 'success', 'msg': 'Scheduled immediate delivery'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: exec - cmd: mailq - task: list
def container_post__exec__mailq__list(self, container_id, request_json):
for container in self.docker_client.containers.list(filters={"id": container_id}):
mailq_return = container.exec_run(["/usr/sbin/postqueue", "-j"], user='postfix')
return exec_run_handler('utf8_text_only', mailq_return)
# api call: container_post - post_action: exec - cmd: mailq - task: flush
def container_post__exec__mailq__flush(self, container_id, request_json):
for container in self.docker_client.containers.list(filters={"id": container_id}):
postqueue_r = container.exec_run(["/usr/sbin/postqueue", "-f"], user='postfix')
return exec_run_handler('generic', postqueue_r)
# api call: container_post - post_action: exec - cmd: mailq - task: super_delete
def container_post__exec__mailq__super_delete(self, container_id, request_json):
for container in self.docker_client.containers.list(filters={"id": container_id}):
postsuper_r = container.exec_run(["/usr/sbin/postsuper", "-d", "ALL"])
return exec_run_handler('generic', postsuper_r)
# api call: container_post - post_action: exec - cmd: system - task: fts_rescan
def container_post__exec__system__fts_rescan(self, container_id, request_json):
if 'username' in request_json:
for container in self.docker_client.containers.list(filters={"id": container_id}):
rescan_return = container.exec_run(["/bin/bash", "-c", "/usr/bin/doveadm fts rescan -u '" + request_json['username'].replace("'", "'\\''") + "'"], user='vmail')
if rescan_return.exit_code == 0:
res = { 'type': 'success', 'msg': 'fts_rescan: rescan triggered'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = { 'type': 'warning', 'msg': 'fts_rescan error'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
if 'all' in request_json:
for container in self.docker_client.containers.list(filters={"id": container_id}):
rescan_return = container.exec_run(["/bin/bash", "-c", "/usr/bin/doveadm fts rescan -A"], user='vmail')
if rescan_return.exit_code == 0:
res = { 'type': 'success', 'msg': 'fts_rescan: rescan triggered'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = { 'type': 'warning', 'msg': 'fts_rescan error'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: exec - cmd: system - task: df
def container_post__exec__system__df(self, container_id, request_json):
if 'dir' in request_json:
for container in self.docker_client.containers.list(filters={"id": container_id}):
df_return = container.exec_run(["/bin/bash", "-c", "/bin/df -H '" + request_json['dir'].replace("'", "'\\''") + "' | /usr/bin/tail -n1 | /usr/bin/tr -s [:blank:] | /usr/bin/tr ' ' ','"], user='nobody')
if df_return.exit_code == 0:
return df_return.output.decode('utf-8').rstrip()
else:
return "0,0,0,0,0,0"
# api call: container_post - post_action: exec - cmd: system - task: mysql_upgrade
def container_post__exec__system__mysql_upgrade(self, container_id, request_json):
for container in self.docker_client.containers.list(filters={"id": container_id}):
sql_return = container.exec_run(["/bin/bash", "-c", "/usr/bin/mysql_upgrade -uroot -p'" + os.environ['DBROOT'].replace("'", "'\\''") + "'\n"], user='mysql')
if sql_return.exit_code == 0:
matched = False
for line in sql_return.output.decode('utf-8').split("\n"):
if 'is already upgraded to' in line:
matched = True
if matched:
res = { 'type': 'success', 'msg':'mysql_upgrade: already upgraded', 'text': sql_return.output.decode('utf-8')}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
container.restart()
res = { 'type': 'warning', 'msg':'mysql_upgrade: upgrade was applied', 'text': sql_return.output.decode('utf-8')}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = { 'type': 'error', 'msg': 'mysql_upgrade: error running command', 'text': sql_return.output.decode('utf-8')}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: exec - cmd: system - task: mysql_tzinfo_to_sql
def container_post__exec__system__mysql_tzinfo_to_sql(self, container_id, request_json):
for container in self.docker_client.containers.list(filters={"id": container_id}):
sql_return = container.exec_run(["/bin/bash", "-c", "/usr/bin/mysql_tzinfo_to_sql /usr/share/zoneinfo | /bin/sed 's/Local time zone must be set--see zic manual page/FCTY/' | /usr/bin/mysql -uroot -p'" + os.environ['DBROOT'].replace("'", "'\\''") + "' mysql \n"], user='mysql')
if sql_return.exit_code == 0:
res = { 'type': 'info', 'msg': 'mysql_tzinfo_to_sql: command completed successfully', 'text': sql_return.output.decode('utf-8')}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = { 'type': 'error', 'msg': 'mysql_tzinfo_to_sql: error running command', 'text': sql_return.output.decode('utf-8')}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: exec - cmd: reload - task: dovecot
def container_post__exec__reload__dovecot(self, container_id, request_json):
for container in self.docker_client.containers.list(filters={"id": container_id}):
reload_return = container.exec_run(["/bin/bash", "-c", "/usr/sbin/dovecot reload"])
return exec_run_handler('generic', reload_return)
# api call: container_post - post_action: exec - cmd: reload - task: postfix
def container_post__exec__reload__postfix(self, container_id, request_json):
for container in self.docker_client.containers.list(filters={"id": container_id}):
reload_return = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postfix reload"])
return exec_run_handler('generic', reload_return)
# api call: container_post - post_action: exec - cmd: reload - task: nginx
def container_post__exec__reload__nginx(self, container_id, request_json):
for container in self.docker_client.containers.list(filters={"id": container_id}):
reload_return = container.exec_run(["/bin/sh", "-c", "/usr/sbin/nginx -s reload"])
return exec_run_handler('generic', reload_return)
# api call: container_post - post_action: exec - cmd: sieve - task: list
def container_post__exec__sieve__list(self, container_id, request_json):
if 'username' in request_json:
for container in self.docker_client.containers.list(filters={"id": container_id}):
sieve_return = container.exec_run(["/bin/bash", "-c", "/usr/bin/doveadm sieve list -u '" + request_json['username'].replace("'", "'\\''") + "'"])
return exec_run_handler('utf8_text_only', sieve_return)
# api call: container_post - post_action: exec - cmd: sieve - task: print
def container_post__exec__sieve__print(self, container_id, request_json):
if 'username' in request_json and 'script_name' in request_json:
for container in self.docker_client.containers.list(filters={"id": container_id}):
cmd = ["/bin/bash", "-c", "/usr/bin/doveadm sieve get -u '" + request_json['username'].replace("'", "'\\''") + "' '" + request_json['script_name'].replace("'", "'\\''") + "'"]
sieve_return = container.exec_run(cmd)
return exec_run_handler('utf8_text_only', sieve_return)
# api call: container_post - post_action: exec - cmd: maildir - task: cleanup
def container_post__exec__maildir__cleanup(self, container_id, request_json):
if 'maildir' in request_json:
for container in self.docker_client.containers.list(filters={"id": container_id}):
sane_name = re.sub(r'\W+', '', request_json['maildir'])
cmd = ["/bin/bash", "-c", "if [[ -d '/var/vmail/" + request_json['maildir'].replace("'", "'\\''") + "' ]]; then /bin/mv '/var/vmail/" + request_json['maildir'].replace("'", "'\\''") + "' '/var/vmail/_garbage/" + str(int(time.time())) + "_" + sane_name + "'; fi"]
maildir_cleanup = container.exec_run(cmd, user='vmail')
return exec_run_handler('generic', maildir_cleanup)
# api call: container_post - post_action: exec - cmd: rspamd - task: worker_password
def container_post__exec__rspamd__worker_password(self, container_id, request_json):
if 'raw' in request_json:
for container in self.docker_client.containers.list(filters={"id": container_id}):
cmd = "/usr/bin/rspamadm pw -e -p '" + request_json['raw'].replace("'", "'\\''") + "' 2> /dev/null"
cmd_response = exec_cmd_container(container, cmd, user="_rspamd")
matched = False
for line in cmd_response.split("\n"):
if '$2$' in line:
hash = line.strip()
hash_out = re.search('\$2\$.+$', hash).group(0)
rspamd_passphrase_hash = re.sub('[^0-9a-zA-Z\$]+', '', hash_out.rstrip())
rspamd_password_filename = "/etc/rspamd/override.d/worker-controller-password.inc"
cmd = '''/bin/echo 'enable_password = "%s";' > %s && cat %s''' % (rspamd_passphrase_hash, rspamd_password_filename, rspamd_password_filename)
cmd_response = exec_cmd_container(container, cmd, user="_rspamd")
if rspamd_passphrase_hash.startswith("$2$") and rspamd_passphrase_hash in cmd_response:
container.restart()
matched = True
if matched:
res = { 'type': 'success', 'msg': 'command completed successfully' }
logger.info('success changing Rspamd password')
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
logger.error('failed changing Rspamd password')
res = { 'type': 'danger', 'msg': 'command did not complete' }
return Response(content=json.dumps(res, indent=4), media_type="application/json")
def exec_cmd_container(container, cmd, user, timeout=2, shell_cmd="/bin/bash"):
def recv_socket_data(c_socket, timeout):
c_socket.setblocking(0)
total_data=[]
data=''
begin=time.time()
while True:
if total_data and time.time()-begin > timeout:
break
elif time.time()-begin > timeout*2:
break
try:
data = c_socket.recv(8192)
if data:
total_data.append(data.decode('utf-8'))
#change the beginning time for measurement
begin=time.time()
else:
#sleep for sometime to indicate a gap
time.sleep(0.1)
break
except:
pass
return ''.join(total_data)
try :
socket = container.exec_run([shell_cmd], stdin=True, socket=True, user=user).output._sock
if not cmd.endswith("\n"):
cmd = cmd + "\n"
socket.send(cmd.encode('utf-8'))
data = recv_socket_data(socket, timeout)
socket.close()
return data
except Exception as e:
logger.error("error - exec_cmd_container: %s" % str(e))
traceback.print_exc(file=sys.stdout)
def exec_run_handler(type, output):
if type == 'generic':
if output.exit_code == 0:
res = { 'type': 'success', 'msg': 'command completed successfully' }
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = { 'type': 'danger', 'msg': 'command failed: ' + output.output.decode('utf-8') }
return Response(content=json.dumps(res, indent=4), media_type="application/json")
if type == 'utf8_text_only':
return Response(content=output.output.decode('utf-8'), media_type="text/plain")
async def get_host_stats(wait=5):
global host_stats_isUpdating
try:
system_time = datetime.now()
host_stats = {
"cpu": {
"cores": psutil.cpu_count(),
"usage": psutil.cpu_percent()
},
"memory": {
"total": psutil.virtual_memory().total,
"usage": psutil.virtual_memory().percent,
"swap": psutil.swap_memory()
},
"uptime": time.time() - psutil.boot_time(),
"system_time": system_time.strftime("%d.%m.%Y %H:%M:%S")
}
redis_client.set('host_stats', json.dumps(host_stats), ex=10)
except Exception as e:
res = {
"type": "danger",
"msg": str(e)
}
await asyncio.sleep(wait)
host_stats_isUpdating = False
async def get_container_stats(container_id, wait=5, stop=False):
global containerIds_to_update
if container_id and container_id.isalnum():
try:
for container in (await async_docker_client.containers.list()):
if container._id == container_id:
res = await container.stats(stream=False)
if redis_client.exists(container_id + '_stats'):
stats = json.loads(redis_client.get(container_id + '_stats'))
else:
stats = []
stats.append(res[0])
if len(stats) > 3:
del stats[0]
redis_client.set(container_id + '_stats', json.dumps(stats), ex=60)
except Exception as e:
res = {
"type": "danger",
"msg": str(e)
}
else:
res = {
"type": "danger",
"msg": "no or invalid id defined"
}
await asyncio.sleep(wait)
if stop == True:
# update task was called second time, stop
containerIds_to_update.remove(container_id)
else:
# call update task a second time
await get_container_stats(container_id, wait=0, stop=True)
if os.environ['REDIS_SLAVEOF_IP'] != "":
redis_client = redis.Redis(host=os.environ['REDIS_SLAVEOF_IP'], port=os.environ['REDIS_SLAVEOF_PORT'], db=0)
else:
redis_client = redis.Redis(host='redis-mailcow', port=6379, db=0)
sync_docker_client = docker.DockerClient(base_url='unix://var/run/docker.sock', version='auto')
async_docker_client = aiodocker.Docker(url='unix:///var/run/docker.sock')
logger.info('DockerApi started')
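
The FastAPI service above exposes its routes (/host/stats, /containers/json, /containers/{container_id}/{post_action}, /container/{container_id}/stats/update) over HTTPS, using the self-signed certificate created in docker-entrypoint.sh. A minimal usage sketch, assuming the service is reachable as https://dockerapi:443 on the compose network and that the third-party requests library is available; the hostname and the example container action are assumptions, not part of this diff:

import requests

BASE = "https://dockerapi:443"  # assumed in-network hostname; adjust to your deployment

# The certificate is self-signed (CN=dockerapi), so verification is disabled for this sketch.
host_stats = requests.get(f"{BASE}/host/stats", verify=False).json()
print(host_stats["cpu"]["usage"], host_stats["memory"]["usage"])

# Dictionary of all containers keyed by id, as built by get_containers() above.
containers = requests.get(f"{BASE}/containers/json", verify=False).json()

# Plain action; exec actions additionally need 'cmd' and 'task' in the JSON body.
some_id = next(iter(containers))  # any container id from the listing
print(requests.post(f"{BASE}/containers/{some_id}/restart", json={}, verify=False).json())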

View File

@@ -1,260 +0,0 @@
import os
import sys
import uvicorn
import json
import uuid
import async_timeout
import asyncio
import aioredis
import aiodocker
import docker
import logging
from logging.config import dictConfig
from fastapi import FastAPI, Response, Request
from modules.DockerApi import DockerApi
dockerapi = None
app = FastAPI()
# Define Routes
@app.get("/host/stats")
async def get_host_update_stats():
global dockerapi
if dockerapi.host_stats_isUpdating == False:
asyncio.create_task(dockerapi.get_host_stats())
dockerapi.host_stats_isUpdating = True
while True:
if await dockerapi.redis_client.exists('host_stats'):
break
await asyncio.sleep(1.5)
stats = json.loads(await dockerapi.redis_client.get('host_stats'))
return Response(content=json.dumps(stats, indent=4), media_type="application/json")
@app.get("/containers/{container_id}/json")
async def get_container(container_id : str):
global dockerapi
if container_id and container_id.isalnum():
try:
for container in (await dockerapi.async_docker_client.containers.list()):
if container._id == container_id:
container_info = await container.show()
return Response(content=json.dumps(container_info, indent=4), media_type="application/json")
res = {
"type": "danger",
"msg": "no container found"
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
except Exception as e:
res = {
"type": "danger",
"msg": str(e)
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = {
"type": "danger",
"msg": "no or invalid id defined"
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
@app.get("/containers/json")
async def get_containers():
global dockerapi
containers = {}
try:
for container in (await dockerapi.async_docker_client.containers.list()):
container_info = await container.show()
containers.update({container_info['Id']: container_info})
return Response(content=json.dumps(containers, indent=4), media_type="application/json")
except Exception as e:
res = {
"type": "danger",
"msg": str(e)
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
@app.post("/containers/{container_id}/{post_action}")
async def post_containers(container_id : str, post_action : str, request: Request):
global dockerapi
try :
request_json = await request.json()
except Exception as err:
request_json = {}
if container_id and container_id.isalnum() and post_action:
try:
"""Dispatch container_post api call"""
if post_action == 'exec':
if not request_json or not 'cmd' in request_json:
res = {
"type": "danger",
"msg": "cmd is missing"
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
if not request_json or not 'task' in request_json:
res = {
"type": "danger",
"msg": "task is missing"
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
api_call_method_name = '__'.join(['container_post', str(post_action), str(request_json['cmd']), str(request_json['task']) ])
else:
api_call_method_name = '__'.join(['container_post', str(post_action) ])
api_call_method = getattr(dockerapi, api_call_method_name, lambda container_id: Response(content=json.dumps({'type': 'danger', 'msg':'container_post - unknown api call' }, indent=4), media_type="application/json"))
dockerapi.logger.info("api call: %s, container_id: %s" % (api_call_method_name, container_id))
return api_call_method(request_json, container_id=container_id)
except Exception as e:
dockerapi.logger.error("error - container_post: %s" % str(e))
res = {
"type": "danger",
"msg": str(e)
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = {
"type": "danger",
"msg": "invalid container id or missing action"
}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
@app.post("/container/{container_id}/stats/update")
async def post_container_update_stats(container_id : str):
global dockerapi
# start update task for container if no task is running
if container_id not in dockerapi.containerIds_to_update:
asyncio.create_task(dockerapi.get_container_stats(container_id))
dockerapi.containerIds_to_update.append(container_id)
while True:
if await dockerapi.redis_client.exists(container_id + '_stats'):
break
await asyncio.sleep(1.5)
stats = json.loads(await dockerapi.redis_client.get(container_id + '_stats'))
return Response(content=json.dumps(stats, indent=4), media_type="application/json")
# Events
@app.on_event("startup")
async def startup_event():
global dockerapi
# Initialize a custom logger
logger = logging.getLogger("dockerapi")
logger.setLevel(logging.INFO)
# Configure the logger to output logs to the terminal
handler = logging.StreamHandler()
handler.setLevel(logging.INFO)
formatter = logging.Formatter("%(levelname)s: %(message)s")
handler.setFormatter(formatter)
logger.addHandler(handler)
logger.info("Init APP")
# Init redis client
if os.environ['REDIS_SLAVEOF_IP'] != "":
redis_client = redis = await aioredis.from_url(f"redis://{os.environ['REDIS_SLAVEOF_IP']}:{os.environ['REDIS_SLAVEOF_PORT']}/0")
else:
redis_client = redis = await aioredis.from_url("redis://redis-mailcow:6379/0")
# Init docker clients
sync_docker_client = docker.DockerClient(base_url='unix://var/run/docker.sock', version='auto')
async_docker_client = aiodocker.Docker(url='unix:///var/run/docker.sock')
dockerapi = DockerApi(redis_client, sync_docker_client, async_docker_client, logger)
logger.info("Subscribe to redis channel")
# Subscribe to redis channel
dockerapi.pubsub = redis.pubsub()
await dockerapi.pubsub.subscribe("MC_CHANNEL")
asyncio.create_task(handle_pubsub_messages(dockerapi.pubsub))
@app.on_event("shutdown")
async def shutdown_event():
global dockerapi
# Close docker connections
dockerapi.sync_docker_client.close()
await dockerapi.async_docker_client.close()
# Close redis
await dockerapi.pubsub.unsubscribe("MC_CHANNEL")
await dockerapi.redis_client.close()
# PubSub Handler
async def handle_pubsub_messages(channel: aioredis.client.PubSub):
global dockerapi
while True:
try:
async with async_timeout.timeout(1):
message = await channel.get_message(ignore_subscribe_messages=True)
if message is not None:
# Parse message
data_json = json.loads(message['data'].decode('utf-8'))
dockerapi.logger.info(f"PubSub Received - {json.dumps(data_json)}")
# Handle api_call
if 'api_call' in data_json:
# api_call: container_post
if data_json['api_call'] == "container_post":
if 'post_action' in data_json and 'container_name' in data_json:
try:
"""Dispatch container_post api call"""
request_json = {}
if data_json['post_action'] == 'exec':
if 'request' in data_json:
request_json = data_json['request']
if 'cmd' in request_json:
if 'task' in request_json:
api_call_method_name = '__'.join(['container_post', str(data_json['post_action']), str(request_json['cmd']), str(request_json['task']) ])
else:
dockerapi.logger.error("api call: task missing")
else:
dockerapi.logger.error("api call: cmd missing")
else:
dockerapi.logger.error("api call: request missing")
else:
api_call_method_name = '__'.join(['container_post', str(data_json['post_action'])])
if api_call_method_name:
api_call_method = getattr(dockerapi, api_call_method_name)
if api_call_method:
dockerapi.logger.info("api call: %s, container_name: %s" % (api_call_method_name, data_json['container_name']))
api_call_method(request_json, container_name=data_json['container_name'])
else:
dockerapi.logger.error("api call not found: %s, container_name: %s" % (api_call_method_name, data_json['container_name']))
except Exception as e:
dockerapi.logger.error("container_post: %s" % str(e))
else:
dockerapi.logger.error("api call: missing container_name, post_action or request")
else:
dockerapi.logger.error("Unknwon PubSub recieved - %s" % json.dumps(data_json))
else:
dockerapi.logger.error("Unknwon PubSub recieved - %s" % json.dumps(data_json))
await asyncio.sleep(0.01)
except asyncio.TimeoutError:
pass
if __name__ == '__main__':
uvicorn.run(
app,
host="0.0.0.0",
port=443,
ssl_certfile="/app/dockerapi_cert.pem",
ssl_keyfile="/app/dockerapi_key.pem",
log_level="info",
loop="none"
)
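
Besides the HTTP routes, the FastAPI entrypoint listed above subscribes to the Redis channel MC_CHANNEL and dispatches container_post calls from JSON messages. A minimal sketch of publishing such a message, assuming direct access to the redis-mailcow instance; the host, port and container name are assumptions, while the message keys mirror handle_pubsub_messages() above:

import json
import redis

# Connect to the mailcow Redis instance (defaults taken from the startup code above).
r = redis.Redis(host="redis-mailcow", port=6379, db=0)

# Plain action: dispatched to container_post__restart with an empty request_json.
r.publish("MC_CHANNEL", json.dumps({
    "api_call": "container_post",
    "post_action": "restart",
    "container_name": "postfix-mailcow",  # assumed container name
}))

# Exec action: needs a 'request' object carrying 'cmd' and 'task'.
r.publish("MC_CHANNEL", json.dumps({
    "api_call": "container_post",
    "post_action": "exec",
    "container_name": "postfix-mailcow",
    "request": {"cmd": "mailq", "task": "flush"},
}))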

View File

@@ -1,486 +0,0 @@
import psutil
import os
import sys
import re
import time
import json
import asyncio
import platform
from datetime import datetime
from fastapi import FastAPI, Response, Request
class DockerApi:
def __init__(self, redis_client, sync_docker_client, async_docker_client, logger):
self.redis_client = redis_client
self.sync_docker_client = sync_docker_client
self.async_docker_client = async_docker_client
self.logger = logger
self.host_stats_isUpdating = False
self.containerIds_to_update = []
# api call: container_post - post_action: stop
def container_post__stop(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(all=True, filters=filters):
container.stop()
res = { 'type': 'success', 'msg': 'command completed successfully'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: start
def container_post__start(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(all=True, filters=filters):
container.start()
res = { 'type': 'success', 'msg': 'command completed successfully'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: restart
def container_post__restart(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(all=True, filters=filters):
container.restart()
res = { 'type': 'success', 'msg': 'command completed successfully'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: top
def container_post__top(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(all=True, filters=filters):
res = { 'type': 'success', 'msg': container.top()}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: stats
def container_post__stats(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(all=True, filters=filters):
for stat in container.stats(decode=True, stream=True):
res = { 'type': 'success', 'msg': stat}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: exec - cmd: mailq - task: delete
def container_post__exec__mailq__delete(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
if 'items' in request_json:
r = re.compile("^[0-9a-fA-F]+$")
filtered_qids = filter(r.match, request_json['items'])
if filtered_qids:
flagged_qids = ['-d %s' % i for i in filtered_qids]
sanitized_string = str(' '.join(flagged_qids))
for container in self.sync_docker_client.containers.list(filters=filters):
postsuper_r = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postsuper " + sanitized_string])
return self.exec_run_handler('generic', postsuper_r)
# api call: container_post - post_action: exec - cmd: mailq - task: hold
def container_post__exec__mailq__hold(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
if 'items' in request_json:
r = re.compile("^[0-9a-fA-F]+$")
filtered_qids = filter(r.match, request_json['items'])
if filtered_qids:
flagged_qids = ['-h %s' % i for i in filtered_qids]
sanitized_string = str(' '.join(flagged_qids))
for container in self.sync_docker_client.containers.list(filters=filters):
postsuper_r = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postsuper " + sanitized_string])
return self.exec_run_handler('generic', postsuper_r)
# api call: container_post - post_action: exec - cmd: mailq - task: cat
def container_post__exec__mailq__cat(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
if 'items' in request_json:
r = re.compile("^[0-9a-fA-F]+$")
filtered_qids = filter(r.match, request_json['items'])
if filtered_qids:
sanitized_string = str(' '.join(filtered_qids))
for container in self.sync_docker_client.containers.list(filters=filters):
postcat_return = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postcat -q " + sanitized_string], user='postfix')
if not postcat_return:
postcat_return = 'err: invalid'
return self.exec_run_handler('utf8_text_only', postcat_return)
# api call: container_post - post_action: exec - cmd: mailq - task: unhold
def container_post__exec__mailq__unhold(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
if 'items' in request_json:
r = re.compile("^[0-9a-fA-F]+$")
filtered_qids = filter(r.match, request_json['items'])
if filtered_qids:
flagged_qids = ['-H %s' % i for i in filtered_qids]
sanitized_string = str(' '.join(flagged_qids))
for container in self.sync_docker_client.containers.list(filters=filters):
postsuper_r = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postsuper " + sanitized_string])
return self.exec_run_handler('generic', postsuper_r)
# api call: container_post - post_action: exec - cmd: mailq - task: deliver
def container_post__exec__mailq__deliver(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
if 'items' in request_json:
r = re.compile("^[0-9a-fA-F]+$")
filtered_qids = filter(r.match, request_json['items'])
if filtered_qids:
flagged_qids = ['-i %s' % i for i in filtered_qids]
for container in self.sync_docker_client.containers.list(filters=filters):
for i in flagged_qids:
postqueue_r = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postqueue " + i], user='postfix')
# todo: check each exit code
res = { 'type': 'success', 'msg': 'Scheduled immediate delivery'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: exec - cmd: mailq - task: list
def container_post__exec__mailq__list(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(filters=filters):
mailq_return = container.exec_run(["/usr/sbin/postqueue", "-j"], user='postfix')
return self.exec_run_handler('utf8_text_only', mailq_return)
# api call: container_post - post_action: exec - cmd: mailq - task: flush
def container_post__exec__mailq__flush(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(filters=filters):
postqueue_r = container.exec_run(["/usr/sbin/postqueue", "-f"], user='postfix')
return self.exec_run_handler('generic', postqueue_r)
# api call: container_post - post_action: exec - cmd: mailq - task: super_delete
def container_post__exec__mailq__super_delete(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(filters=filters):
postsuper_r = container.exec_run(["/usr/sbin/postsuper", "-d", "ALL"])
return self.exec_run_handler('generic', postsuper_r)
# api call: container_post - post_action: exec - cmd: system - task: fts_rescan
def container_post__exec__system__fts_rescan(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
if 'username' in request_json:
for container in self.sync_docker_client.containers.list(filters=filters):
rescan_return = container.exec_run(["/bin/bash", "-c", "/usr/bin/doveadm fts rescan -u '" + request_json['username'].replace("'", "'\\''") + "'"], user='vmail')
if rescan_return.exit_code == 0:
res = { 'type': 'success', 'msg': 'fts_rescan: rescan triggered'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = { 'type': 'warning', 'msg': 'fts_rescan error'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
if 'all' in request_json:
for container in self.sync_docker_client.containers.list(filters=filters):
rescan_return = container.exec_run(["/bin/bash", "-c", "/usr/bin/doveadm fts rescan -A"], user='vmail')
if rescan_return.exit_code == 0:
res = { 'type': 'success', 'msg': 'fts_rescan: rescan triggered'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = { 'type': 'warning', 'msg': 'fts_rescan error'}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: exec - cmd: system - task: df
def container_post__exec__system__df(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
if 'dir' in request_json:
for container in self.sync_docker_client.containers.list(filters=filters):
df_return = container.exec_run(["/bin/bash", "-c", "/bin/df -H '" + request_json['dir'].replace("'", "'\\''") + "' | /usr/bin/tail -n1 | /usr/bin/tr -s [:blank:] | /usr/bin/tr ' ' ','"], user='nobody')
if df_return.exit_code == 0:
return df_return.output.decode('utf-8').rstrip()
else:
return "0,0,0,0,0,0"
# api call: container_post - post_action: exec - cmd: system - task: mysql_upgrade
def container_post__exec__system__mysql_upgrade(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(filters=filters):
sql_return = container.exec_run(["/bin/bash", "-c", "/usr/bin/mysql_upgrade -uroot -p'" + os.environ['DBROOT'].replace("'", "'\\''") + "'\n"], user='mysql')
if sql_return.exit_code == 0:
matched = False
for line in sql_return.output.decode('utf-8').split("\n"):
if 'is already upgraded to' in line:
matched = True
if matched:
res = { 'type': 'success', 'msg':'mysql_upgrade: already upgraded', 'text': sql_return.output.decode('utf-8')}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
container.restart()
res = { 'type': 'warning', 'msg':'mysql_upgrade: upgrade was applied', 'text': sql_return.output.decode('utf-8')}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = { 'type': 'error', 'msg': 'mysql_upgrade: error running command', 'text': sql_return.output.decode('utf-8')}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: exec - cmd: system - task: mysql_tzinfo_to_sql
def container_post__exec__system__mysql_tzinfo_to_sql(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(filters=filters):
sql_return = container.exec_run(["/bin/bash", "-c", "/usr/bin/mysql_tzinfo_to_sql /usr/share/zoneinfo | /bin/sed 's/Local time zone must be set--see zic manual page/FCTY/' | /usr/bin/mysql -uroot -p'" + os.environ['DBROOT'].replace("'", "'\\''") + "' mysql \n"], user='mysql')
if sql_return.exit_code == 0:
res = { 'type': 'info', 'msg': 'mysql_tzinfo_to_sql: command completed successfully', 'text': sql_return.output.decode('utf-8')}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = { 'type': 'error', 'msg': 'mysql_tzinfo_to_sql: error running command', 'text': sql_return.output.decode('utf-8')}
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# api call: container_post - post_action: exec - cmd: reload - task: dovecot
def container_post__exec__reload__dovecot(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(filters=filters):
reload_return = container.exec_run(["/bin/bash", "-c", "/usr/sbin/dovecot reload"])
return self.exec_run_handler('generic', reload_return)
# api call: container_post - post_action: exec - cmd: reload - task: postfix
def container_post__exec__reload__postfix(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(filters=filters):
reload_return = container.exec_run(["/bin/bash", "-c", "/usr/sbin/postfix reload"])
return self.exec_run_handler('generic', reload_return)
# api call: container_post - post_action: exec - cmd: reload - task: nginx
def container_post__exec__reload__nginx(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
for container in self.sync_docker_client.containers.list(filters=filters):
reload_return = container.exec_run(["/bin/sh", "-c", "/usr/sbin/nginx -s reload"])
return self.exec_run_handler('generic', reload_return)
# api call: container_post - post_action: exec - cmd: sieve - task: list
def container_post__exec__sieve__list(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
if 'username' in request_json:
for container in self.sync_docker_client.containers.list(filters=filters):
sieve_return = container.exec_run(["/bin/bash", "-c", "/usr/bin/doveadm sieve list -u '" + request_json['username'].replace("'", "'\\''") + "'"])
return self.exec_run_handler('utf8_text_only', sieve_return)
# api call: container_post - post_action: exec - cmd: sieve - task: print
def container_post__exec__sieve__print(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
if 'username' in request_json and 'script_name' in request_json:
for container in self.sync_docker_client.containers.list(filters=filters):
cmd = ["/bin/bash", "-c", "/usr/bin/doveadm sieve get -u '" + request_json['username'].replace("'", "'\\''") + "' '" + request_json['script_name'].replace("'", "'\\''") + "'"]
sieve_return = container.exec_run(cmd)
return self.exec_run_handler('utf8_text_only', sieve_return)
# api call: container_post - post_action: exec - cmd: maildir - task: cleanup
def container_post__exec__maildir__cleanup(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
if 'maildir' in request_json:
for container in self.sync_docker_client.containers.list(filters=filters):
sane_name = re.sub(r'\W+', '', request_json['maildir'])
vmail_name = request_json['maildir'].replace("'", "'\\''")
cmd_vmail = "if [[ -d '/var/vmail/" + vmail_name + "' ]]; then /bin/mv '/var/vmail/" + vmail_name + "' '/var/vmail/_garbage/" + str(int(time.time())) + "_" + sane_name + "'; fi"
index_name = request_json['maildir'].split("/")
if len(index_name) > 1:
index_name = index_name[1].replace("'", "'\\''") + "@" + index_name[0].replace("'", "'\\''")
cmd_vmail_index = "if [[ -d '/var/vmail_index/" + index_name + "' ]]; then /bin/mv '/var/vmail_index/" + index_name + "' '/var/vmail/_garbage/" + str(int(time.time())) + "_" + sane_name + "_index'; fi"
cmd = ["/bin/bash", "-c", cmd_vmail + " && " + cmd_vmail_index]
else:
cmd = ["/bin/bash", "-c", cmd_vmail]
maildir_cleanup = container.exec_run(cmd, user='vmail')
return self.exec_run_handler('generic', maildir_cleanup)
# api call: container_post - post_action: exec - cmd: rspamd - task: worker_password
def container_post__exec__rspamd__worker_password(self, request_json, **kwargs):
if 'container_id' in kwargs:
filters = {"id": kwargs['container_id']}
elif 'container_name' in kwargs:
filters = {"name": kwargs['container_name']}
if 'raw' in request_json:
for container in self.sync_docker_client.containers.list(filters=filters):
cmd = "/usr/bin/rspamadm pw -e -p '" + request_json['raw'].replace("'", "'\\''") + "' 2> /dev/null"
cmd_response = self.exec_cmd_container(container, cmd, user="_rspamd")
matched = False
for line in cmd_response.split("\n"):
if '$2$' in line:
hash = line.strip()
hash_out = re.search(r'\$2\$.+$', hash).group(0)
rspamd_passphrase_hash = re.sub(r'[^0-9a-zA-Z\$]+', '', hash_out.rstrip())
rspamd_password_filename = "/etc/rspamd/override.d/worker-controller-password.inc"
cmd = '''/bin/echo 'enable_password = "%s";' > %s && cat %s''' % (rspamd_passphrase_hash, rspamd_password_filename, rspamd_password_filename)
cmd_response = self.exec_cmd_container(container, cmd, user="_rspamd")
if rspamd_passphrase_hash.startswith("$2$") and rspamd_passphrase_hash in cmd_response:
container.restart()
matched = True
if matched:
res = { 'type': 'success', 'msg': 'command completed successfully' }
self.logger.info('success changing Rspamd password')
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
self.logger.error('failed changing Rspamd password')
res = { 'type': 'danger', 'msg': 'command did not complete' }
return Response(content=json.dumps(res, indent=4), media_type="application/json")
# Collect host stats
async def get_host_stats(self, wait=5):
try:
system_time = datetime.now()
host_stats = {
"cpu": {
"cores": psutil.cpu_count(),
"usage": psutil.cpu_percent()
},
"memory": {
"total": psutil.virtual_memory().total,
"usage": psutil.virtual_memory().percent,
"swap": psutil.swap_memory()
},
"uptime": time.time() - psutil.boot_time(),
"system_time": system_time.strftime("%d.%m.%Y %H:%M:%S"),
"architecture": platform.machine()
}
await self.redis_client.set('host_stats', json.dumps(host_stats), ex=10)
except Exception as e:
res = {
"type": "danger",
"msg": str(e)
}
await asyncio.sleep(wait)
self.host_stats_isUpdating = False
# Collect container stats
async def get_container_stats(self, container_id, wait=5, stop=False):
if container_id and container_id.isalnum():
try:
for container in (await self.async_docker_client.containers.list()):
if container._id == container_id:
res = await container.stats(stream=False)
if await self.redis_client.exists(container_id + '_stats'):
stats = json.loads(await self.redis_client.get(container_id + '_stats'))
else:
stats = []
stats.append(res[0])
if len(stats) > 3:
del stats[0]
await self.redis_client.set(container_id + '_stats', json.dumps(stats), ex=60)
except Exception as e:
res = {
"type": "danger",
"msg": str(e)
}
else:
res = {
"type": "danger",
"msg": "no or invalid id defined"
}
await asyncio.sleep(wait)
if stop == True:
# update task was called second time, stop
self.containerIds_to_update.remove(container_id)
else:
# call update task a second time
await self.get_container_stats(container_id, wait=0, stop=True)
def exec_cmd_container(self, container, cmd, user, timeout=2, shell_cmd="/bin/bash"):
def recv_socket_data(c_socket, timeout):
c_socket.setblocking(0)
total_data=[]
data=''
begin=time.time()
while True:
if total_data and time.time()-begin > timeout:
break
elif time.time()-begin > timeout*2:
break
try:
data = c_socket.recv(8192)
if data:
total_data.append(data.decode('utf-8'))
#change the beginning time for measurement
begin=time.time()
else:
#sleep for some time to indicate a gap
time.sleep(0.1)
break
except:
pass
return ''.join(total_data)
try:
socket = container.exec_run([shell_cmd], stdin=True, socket=True, user=user).output._sock
if not cmd.endswith("\n"):
cmd = cmd + "\n"
socket.send(cmd.encode('utf-8'))
data = recv_socket_data(socket, timeout)
socket.close()
return data
except Exception as e:
self.logger.error("error - exec_cmd_container: %s" % str(e))
traceback.print_exc(file=sys.stdout)
def exec_run_handler(self, type, output):
if type == 'generic':
if output.exit_code == 0:
res = { 'type': 'success', 'msg': 'command completed successfully' }
return Response(content=json.dumps(res, indent=4), media_type="application/json")
else:
res = { 'type': 'danger', 'msg': 'command failed: ' + output.output.decode('utf-8') }
return Response(content=json.dumps(res, indent=4), media_type="application/json")
if type == 'utf8_text_only':
return Response(content=output.output.decode('utf-8'), media_type="text/plain")
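All of the exec handlers above pass user-supplied values to a shell inside the target container, so each value is wrapped in single quotes and embedded quotes are escaped with the '\'' idiom. A minimal standalone sketch of that quoting step (quote_single is a hypothetical helper name, not part of the codebase):
# Sketch of the single-quote escaping used by the exec handlers above;
# it mirrors request_json['username'].replace("'", "'\\''").
def quote_single(value: str) -> str:
    # Close the quoted string, emit an escaped literal quote, then reopen it:
    # o'brien -> 'o'\''brien'
    return "'" + value.replace("'", "'\\''") + "'"

if __name__ == "__main__":
    username = "o'brien@example.com"  # illustrative address
    print("/usr/bin/doveadm sieve list -u " + quote_single(username))
    # -> /usr/bin/doveadm sieve list -u 'o'\''brien@example.com'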

View File

@@ -1,27 +1,17 @@
FROM debian:bullseye-slim
LABEL maintainer "The Infrastructure Company GmbH <info@servercow.de>"
FROM debian:bullseye-slim as build
LABEL maintainer "Andre Peters <andre.peters@servercow.de>"
ARG DEBIAN_FRONTEND=noninteractive
# renovate: datasource=github-tags depName=dovecot/core versioning=semver-coerced
ARG DOVECOT=2.3.20
# renovate: datasource=github-releases depName=tianon/gosu versioning=semver-coerced
ARG GOSU_VERSION=1.16
ARG DOVECOT=2.3.19.1
ARG FLATCURVE=v0.3.2
ARG XAPIAN=1.4.21
ENV LC_ALL C
# Add groups and users before installing Dovecot to not break compatibility
RUN groupadd -g 5000 vmail \
&& groupadd -g 401 dovecot \
&& groupadd -g 402 dovenull \
&& groupadd -g 999 sogo \
&& usermod -a -G sogo nobody \
&& useradd -g vmail -u 5000 vmail -d /var/vmail \
&& useradd -c "Dovecot unprivileged user" -d /dev/null -u 401 -g dovecot -s /bin/false dovecot \
&& useradd -c "Dovecot login user" -d /dev/null -u 402 -g dovenull -s /bin/false dovenull \
&& touch /etc/default/locale \
RUN touch /etc/default/locale \
&& apt-get update \
&& apt-get -y --no-install-recommends install \
build-essential \
apt-transport-https \
ca-certificates \
cpanminus \
@@ -62,7 +52,6 @@ RUN groupadd -g 5000 vmail \
libproc-processtable-perl \
libreadonly-perl \
libregexp-common-perl \
libssl-dev \
libsys-meminfo-perl \
libterm-readkey-perl \
libtest-deep-perl \
@@ -78,7 +67,13 @@ RUN groupadd -g 5000 vmail \
libunicode-string-perl \
liburi-perl \
libwww-perl \
libstemmer-dev \
libexttextcat-dev \
libldap-dev \
libghc-bzlib-dev \
lua-sql-mysql \
liblz4-dev \
libzstd-dev \
lua-socket \
mariadb-client \
procps \
@@ -89,32 +84,152 @@ RUN groupadd -g 5000 vmail \
syslog-ng-core \
syslog-ng-mod-redis \
wget \
git \
bison \
flex \
build-essential \
autoconf \
automake \
libtool \
make \
libxapian-dev \
default-libmysqlclient-dev \
libicu-dev \
zlib1g-dev \
pkg-config \
libsqlite3-dev \
liblua5.3-dev \
&& dpkgArch="$(dpkg --print-architecture | awk -F- '{ print $NF }')" \
&& wget -O /usr/local/bin/gosu "https://github.com/tianon/gosu/releases/download/$GOSU_VERSION/gosu-$dpkgArch" \
&& chmod +x /usr/local/bin/gosu \
&& gosu nobody true \
&& apt-key adv --fetch-keys https://repo.dovecot.org/DOVECOT-REPO-GPG \
&& echo "deb https://repo.dovecot.org/ce-${DOVECOT}/debian/bullseye bullseye main" > /etc/apt/sources.list.d/dovecot.list \
&& apt-get update \
&& apt-get -y --no-install-recommends install \
dovecot-lua \
dovecot-managesieved \
dovecot-sieve \
dovecot-lmtpd \
dovecot-ldap \
dovecot-mysql \
dovecot-core \
dovecot-pop3d \
dovecot-imapd \
dovecot-solr \
&& pip3 install mysql-connector-python html2text jinja2 redis \
&& apt-get autoremove --purge -y \
&& apt-get autoclean \
&& rm -rf /var/lib/apt/lists/* \
&& rm -rf /tmp/* /var/tmp/* /root/.cache/
# imapsync dependencies
RUN cpan Crypt::OpenSSL::PKCS12
&& gosu nobody true
# && apt-key adv --fetch-keys https://repo.dovecot.org/DOVECOT-REPO-GPG \
# && echo "deb https://repo.dovecot.org/ce-${DOVECOT}/debian/bullseye bullseye main" > /etc/apt/sources.list.d/dovecot.list \
# && apt-get update \
# && apt-get -y --no-install-recommends install \
# dovecot-lua \
# dovecot-managesieved \
# dovecot-sieve \
# dovecot-lmtpd \
# dovecot-ldap \
# dovecot-mysql \
# dovecot-core \
# dovecot-pop3d \
# dovecot-imapd \
# dovecot-dev
RUN cd /tmp && git clone --depth 1 --branch ${DOVECOT} https://github.com/dovecot/core.git dovecot/core && cd dovecot/core \
&& ./autogen.sh \
&& PANDOC=false ./configure --prefix=/usr --sysconfdir=/etc --localstatedir=/var --with-ssldir=/etc/ssl --enable-maintainer-mode --with-sql=yes --with-lua=yes --with-mysql --with-ldap --with-zstd --with-lz4 --with-ssl=openssl --with-notify=inotify --with-bzlib --with-zlib --enable-hardening --with-stemmer --with-textcat --with-icu \
&& make -j6 \
&& make install \
&& make clean
RUN cd /tmp && git clone --depth 1 --branch release-0.5 https://github.com/dovecot/pigeonhole dovecot/pigeonhole && cd dovecot/pigeonhole \
&& ./autogen.sh \
&& ./configure --with-dovecot=/usr/lib/dovecot --with-managesieve \
&& make -j6 \
&& make install \
&& make clean
RUN cd /tmp && wget https://oligarchy.co.uk/xapian/${XAPIAN}/xapian-core-${XAPIAN}.tar.xz && tar xf xapian-core-${XAPIAN}.tar.xz && cd xapian-core-${XAPIAN} \
&& ./configure --prefix=/usr/local/xapian \
&& make -j6 \
&& make install \
&& make clean
RUN cd /tmp && git clone --depth 1 --branch ${FLATCURVE} https://github.com/slusarz/dovecot-fts-flatcurve.git dovecot/flatcurve && cd dovecot/flatcurve \
&& ./autogen.sh \
&& ./configure --with-dovecot=/usr/lib/dovecot \
&& make -j6 \
&& make install \
&& make clean
FROM debian:bullseye-slim
RUN groupadd -g 5000 vmail \
&& groupadd -g 401 dovecot \
&& groupadd -g 402 dovenull \
&& groupadd -g 999 sogo \
&& usermod -a -G sogo nobody \
&& useradd -g vmail -u 5000 vmail -d /var/vmail \
&& useradd -c "Dovecot unprivileged user" -d /dev/null -u 401 -g dovecot -s /bin/false dovecot \
&& useradd -c "Dovecot login user" -d /dev/null -u 402 -g dovenull -s /bin/false dovenull \
&& apt update && apt install lua-socket \
mariadb-client \
libstemmer-dev \
libexttextcat-dev \
libicu-dev \
libxapian-dev \
libsqlite3-dev \
liblua5.3-dev \
lua-sql-mysql \
libldap-dev \
procps \
python3-pip \
redis-server \
supervisor \
syslog-ng \
syslog-ng-core \
syslog-ng-mod-redis \
cpanminus \
curl \
libauthen-ntlm-perl \
libcgi-pm-perl \
libcrypt-openssl-rsa-perl \
libcrypt-ssleay-perl \
libdata-uniqid-perl \
libdbd-mysql-perl \
libdbi-perl \
libdigest-hmac-perl \
libdist-checkconflicts-perl \
libencode-imaputf7-perl \
libfile-copy-recursive-perl \
libfile-tail-perl \
libhtml-parser-perl \
libio-compress-perl \
libio-socket-inet6-perl \
libio-socket-ssl-perl \
libio-tee-perl \
libipc-run-perl \
libjson-webtoken-perl \
liblockfile-simple-perl \
libmail-imapclient-perl \
libmodule-implementation-perl \
libmodule-scandeps-perl \
libnet-ssleay-perl \
libpackage-stash-perl \
libpackage-stash-xs-perl \
libpar-packer-perl \
libparse-recdescent-perl \
libproc-processtable-perl \
libreadonly-perl \
libregexp-common-perl \
libsys-meminfo-perl \
libterm-readkey-perl \
libtest-deep-perl \
libtest-fatal-perl \
libtest-mock-guard-perl \
libtest-mockobject-perl \
libtest-nowarnings-perl \
libtest-pod-perl \
libtest-requires-perl \
libtest-simple-perl \
libtest-warn-perl \
libtry-tiny-perl \
libunicode-string-perl \
liburi-perl \
libwww-perl \
dnsutils \
gettext-base -y --no-install-recommends \
&& pip3 install mysql-connector-python html2text jinja2 redis
COPY --from=build /usr/lib/dovecot /usr/lib/dovecot
COPY --from=build /usr/bin/doveconf /usr/bin/doveconf
COPY --from=build /usr/bin/doveadm /usr/bin/doveadm
COPY --from=build /usr/bin/dovecot-sysreport /usr/bin/dovecot-sysreport
COPY --from=build /usr/sbin/dovecot /usr/sbin/dovecot
COPY --from=build /usr/libexec/dovecot/ /usr/libexec/dovecot/
COPY --from=build /usr/local/bin /usr/local/bin
COPY --from=build /usr/local/xapian/ /usr/local/xapian
COPY trim_logs.sh /usr/local/bin/trim_logs.sh
COPY clean_q_aged.sh /usr/local/bin/clean_q_aged.sh
COPY syslog-ng.conf /etc/syslog-ng/syslog-ng.conf

View File

@@ -109,17 +109,19 @@ EOF
echo -n ${ACL_ANYONE} > /etc/dovecot/acl_anyone
if [[ "${SKIP_SOLR}" =~ ^([yY][eE][sS]|[yY])+$ ]]; then
if [[ "${SKIP_XAPIAN}" =~ ^([yY][eE][sS]|[yY])+$ ]]; then
echo -n 'quota acl zlib mail_crypt mail_crypt_acl mail_log notify listescape replication' > /etc/dovecot/mail_plugins
echo -n 'quota imap_quota imap_acl acl zlib imap_zlib imap_sieve mail_crypt mail_crypt_acl notify listescape replication mail_log' > /etc/dovecot/mail_plugins_imap
echo -n 'quota sieve acl zlib mail_crypt mail_crypt_acl notify listescape replication' > /etc/dovecot/mail_plugins_lmtp
else
echo -n 'quota acl zlib mail_crypt mail_crypt_acl mail_log notify fts fts_solr listescape replication' > /etc/dovecot/mail_plugins
echo -n 'quota imap_quota imap_acl acl zlib imap_zlib imap_sieve mail_crypt mail_crypt_acl notify mail_log fts fts_solr listescape replication' > /etc/dovecot/mail_plugins_imap
echo -n 'quota sieve acl zlib mail_crypt mail_crypt_acl fts fts_solr notify listescape replication' > /etc/dovecot/mail_plugins_lmtp
echo -n 'quota acl zlib mail_crypt mail_crypt_acl mail_log notify fts fts_flatcurve listescape replication' > /etc/dovecot/mail_plugins
echo -n 'quota imap_quota imap_acl acl zlib imap_zlib imap_sieve mail_crypt mail_crypt_acl notify mail_log fts fts_flatcurve listescape replication' > /etc/dovecot/mail_plugins_imap
echo -n 'quota sieve acl zlib mail_crypt mail_crypt_acl fts fts_flatcurve notify listescape replication' > /etc/dovecot/mail_plugins_lmtp
fi
chmod 644 /etc/dovecot/mail_plugins /etc/dovecot/mail_plugins_imap /etc/dovecot/mail_plugins_lmtp /templates/quarantine.tpl
sed -i 's/vsz_limit.*/vsz_limit = '${XAPIAN_HEAP}m/g /etc/dovecot/dovecot-fts-flatcurve.conf
cat <<EOF > /etc/dovecot/sql/dovecot-dict-sql-userdb.conf
# Autogenerated by mailcow
driver = mysql
@@ -159,7 +161,7 @@ function auth_password_verify(req, pass)
VALUES ("%s", 0, "%s", "%s")]], con:escape(req.service), con:escape(req.user), con:escape(req.real_rip)))
cur:close()
con:close()
return dovecot.auth.PASSDB_RESULT_OK, ""
return dovecot.auth.PASSDB_RESULT_OK, "password=" .. pass
end
row = cur:fetch (row, "a")
end
@@ -180,13 +182,13 @@ function auth_password_verify(req, pass)
if tostring(req.real_rip) == "__IPV4_SOGO__" then
cur:close()
con:close()
return dovecot.auth.PASSDB_RESULT_OK, ""
return dovecot.auth.PASSDB_RESULT_OK, "password=" .. pass
elseif row.has_prot_access == "1" then
con:execute(string.format([[REPLACE INTO sasl_log (service, app_password, username, real_rip)
VALUES ("%s", %d, "%s", "%s")]], con:escape(req.service), row.id, con:escape(req.user), con:escape(req.real_rip)))
cur:close()
con:close()
return dovecot.auth.PASSDB_RESULT_OK, ""
return dovecot.auth.PASSDB_RESULT_OK, "password=" .. pass
end
end
row = cur:fetch (row, "a")
@@ -315,14 +317,8 @@ remote ${IPV4_NETWORK}.248 {
}
EOF
# Set SOGo SSO master Password
if [ ! -z "$SOGO_SSO_PASS" ]; then
# Set from env var
RAND_PASS=$SOGO_SSO_PASS
else
# Create random master Password
RAND_PASS=$(cat /dev/urandom | tr -dc 'a-z0-9' | fold -w 32 | head -n 1)
fi
# Create random master Password for SOGo SSO
RAND_PASS=$(cat /dev/urandom | tr -dc 'a-z0-9' | fold -w 32 | head -n 1)
echo -n ${RAND_PASS} > /etc/phpfpm/sogo-sso.pass
cat <<EOF > /etc/dovecot/sogo-sso.conf
# Autogenerated by mailcow

View File

@@ -8492,7 +8492,6 @@ sub xoauth2
require HTML::Entities ;
require JSON ;
require JSON::WebToken::Crypt::RSA ;
require Crypt::OpenSSL::PKCS12;
require Crypt::OpenSSL::RSA ;
require Encode::Byte ;
require IO::Socket::SSL ;
@@ -8533,9 +8532,8 @@ sub xoauth2
$sync->{ debug } and myprint( "Service account: $iss\nKey file: $keyfile\nKey password: $keypass\n");
# Get private key from p12 file
my $pkcs12 = Crypt::OpenSSL::PKCS12->new_from_file($keyfile);
$key = $pkcs12->private_key($keypass);
# Get private key from p12 file (would be better in perl...)
$key = `openssl pkcs12 -in "$keyfile" -nodes -nocerts -passin pass:$keypass -nomacver`;
$sync->{ debug } and myprint( "Private key:\n$key\n");
}
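The change above drops the Crypt::OpenSSL::PKCS12 binding in favour of shelling out to the openssl CLI with the flags shown (-nodes -nocerts -nomacver). For reference, the same extraction expressed with Python's subprocess; the file path and password are illustrative:
import subprocess

# Sketch: extract the unencrypted private key from a PKCS#12 file, mirroring
# the `openssl pkcs12 -nodes -nocerts -nomacver` call introduced above.
keyfile = "service-account.p12"  # illustrative path
keypass = "notasecret"           # illustrative password

key = subprocess.run(
    ["openssl", "pkcs12", "-in", keyfile, "-nodes", "-nocerts",
     "-passin", "pass:" + keypass, "-nomacver"],
    capture_output=True, text=True, check=True,
).stdout
print(key)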

View File

@@ -1,5 +1,5 @@
FROM alpine:3.17
LABEL maintainer "The Infrastructure Company GmbH <info@servercow.de>"
LABEL maintainer "Andre Peters <andre.peters@servercow.de>"
ENV XTABLES_LIBDIR /usr/lib/xtables
ENV PYTHON_IPTABLES_XTABLES_VERSION 12

View File

@@ -64,40 +64,28 @@ def refreshF2boptions():
global f2boptions
global quit_now
global exit_code
f2boptions = {}
if not r.get('F2B_OPTIONS'):
f2boptions['ban_time'] = r.get('F2B_BAN_TIME')
f2boptions['max_ban_time'] = r.get('F2B_MAX_BAN_TIME')
f2boptions['ban_time_increment'] = r.get('F2B_BAN_TIME_INCREMENT')
f2boptions['max_attempts'] = r.get('F2B_MAX_ATTEMPTS')
f2boptions['retry_window'] = r.get('F2B_RETRY_WINDOW')
f2boptions['netban_ipv4'] = r.get('F2B_NETBAN_IPV4')
f2boptions['netban_ipv6'] = r.get('F2B_NETBAN_IPV6')
f2boptions = {}
f2boptions['ban_time'] = int
f2boptions['max_attempts'] = int
f2boptions['retry_window'] = int
f2boptions['netban_ipv4'] = int
f2boptions['netban_ipv6'] = int
f2boptions['ban_time'] = r.get('F2B_BAN_TIME') or 1800
f2boptions['max_attempts'] = r.get('F2B_MAX_ATTEMPTS') or 10
f2boptions['retry_window'] = r.get('F2B_RETRY_WINDOW') or 600
f2boptions['netban_ipv4'] = r.get('F2B_NETBAN_IPV4') or 32
f2boptions['netban_ipv6'] = r.get('F2B_NETBAN_IPV6') or 128
r.set('F2B_OPTIONS', json.dumps(f2boptions, ensure_ascii=False))
else:
try:
f2boptions = {}
f2boptions = json.loads(r.get('F2B_OPTIONS'))
except ValueError:
print('Error loading F2B options: F2B_OPTIONS is not json')
quit_now = True
exit_code = 2
verifyF2boptions(f2boptions)
r.set('F2B_OPTIONS', json.dumps(f2boptions, ensure_ascii=False))
def verifyF2boptions(f2boptions):
verifyF2boption(f2boptions,'ban_time', 1800)
verifyF2boption(f2boptions,'max_ban_time', 10000)
verifyF2boption(f2boptions,'ban_time_increment', True)
verifyF2boption(f2boptions,'max_attempts', 10)
verifyF2boption(f2boptions,'retry_window', 600)
verifyF2boption(f2boptions,'netban_ipv4', 32)
verifyF2boption(f2boptions,'netban_ipv6', 128)
def verifyF2boption(f2boptions, f2boption, f2bdefault):
f2boptions[f2boption] = f2boptions[f2boption] if f2boption in f2boptions and f2boptions[f2boption] is not None else f2bdefault
def refreshF2bregex():
global f2bregex
global quit_now
@@ -159,7 +147,6 @@ def ban(address):
global lock
refreshF2boptions()
BAN_TIME = int(f2boptions['ban_time'])
BAN_TIME_INCREMENT = bool(f2boptions['ban_time_increment'])
MAX_ATTEMPTS = int(f2boptions['max_attempts'])
RETRY_WINDOW = int(f2boptions['retry_window'])
NETBAN_IPV4 = '/' + str(f2boptions['netban_ipv4'])
@@ -187,16 +174,20 @@ def ban(address):
net = ipaddress.ip_network((address + (NETBAN_IPV4 if type(ip) is ipaddress.IPv4Address else NETBAN_IPV6)), strict=False)
net = str(net)
if not net in bans:
bans[net] = {'attempts': 0, 'last_attempt': 0, 'ban_counter': 0}
if not net in bans or time.time() - bans[net]['last_attempt'] > RETRY_WINDOW:
bans[net] = { 'attempts': 0 }
active_window = RETRY_WINDOW
else:
active_window = time.time() - bans[net]['last_attempt']
bans[net]['attempts'] += 1
bans[net]['last_attempt'] = time.time()
active_window = time.time() - bans[net]['last_attempt']
if bans[net]['attempts'] >= MAX_ATTEMPTS:
cur_time = int(round(time.time()))
NET_BAN_TIME = BAN_TIME if not BAN_TIME_INCREMENT else BAN_TIME * 2 ** bans[net]['ban_counter']
logCrit('Banning %s for %d minutes' % (net, NET_BAN_TIME / 60 ))
logCrit('Banning %s for %d minutes' % (net, BAN_TIME / 60))
if type(ip) is ipaddress.IPv4Address:
with lock:
chain = iptc.Chain(iptc.Table(iptc.Table.FILTER), 'MAILCOW')
@@ -215,7 +206,7 @@ def ban(address):
rule.target = target
if rule not in chain.rules:
chain.insert_rule(rule)
r.hset('F2B_ACTIVE_BANS', '%s' % net, cur_time + NET_BAN_TIME)
r.hset('F2B_ACTIVE_BANS', '%s' % net, cur_time + BAN_TIME)
else:
logWarn('%d more attempts in the next %d seconds until %s is banned' % (MAX_ATTEMPTS - bans[net]['attempts'], RETRY_WINDOW, net))
@@ -247,8 +238,7 @@ def unban(net):
r.hdel('F2B_ACTIVE_BANS', '%s' % net)
r.hdel('F2B_QUEUE_UNBAN', '%s' % net)
if net in bans:
bans[net]['attempts'] = 0
bans[net]['ban_counter'] += 1
del bans[net]
def permBan(net, unban=False):
global lock
@@ -342,7 +332,7 @@ def watch():
logWarn('%s matched rule id %s (%s)' % (addr, rule_id, item['data']))
ban(addr)
except Exception as ex:
logWarn('Error reading log line from pubsub: %s' % ex)
logWarn('Error reading log line from pubsub')
quit_now = True
exit_code = 2
@@ -376,8 +366,6 @@ def snat4(snat_target):
chain.insert_rule(new_rule)
else:
for position, rule in enumerate(chain.rules):
if not hasattr(rule.target, 'parameter'):
continue
match = all((
new_rule.get_src() == rule.get_src(),
new_rule.get_dst() == rule.get_dst(),
@@ -437,8 +425,6 @@ def autopurge():
time.sleep(10)
refreshF2boptions()
BAN_TIME = int(f2boptions['ban_time'])
MAX_BAN_TIME = int(f2boptions['max_ban_time'])
BAN_TIME_INCREMENT = bool(f2boptions['ban_time_increment'])
MAX_ATTEMPTS = int(f2boptions['max_attempts'])
QUEUE_UNBAN = r.hgetall('F2B_QUEUE_UNBAN')
if QUEUE_UNBAN:
@@ -446,9 +432,7 @@ def autopurge():
unban(str(net))
for net in bans.copy():
if bans[net]['attempts'] >= MAX_ATTEMPTS:
NET_BAN_TIME = BAN_TIME if not BAN_TIME_INCREMENT else BAN_TIME * 2 ** bans[net]['ban_counter']
TIME_SINCE_LAST_ATTEMPT = time.time() - bans[net]['last_attempt']
if TIME_SINCE_LAST_ATTEMPT > NET_BAN_TIME or TIME_SINCE_LAST_ATTEMPT > MAX_BAN_TIME:
if time.time() - bans[net]['last_attempt'] > BAN_TIME:
unban(net)
def isIpNetwork(address):
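Both versions of ban() above first collapse the offending address into a network using the configured netban masks, so repeated attempts from the same subnet share one counter. A small standalone illustration of that step (the address and the /24 mask are illustrative; the defaults above are /32 for IPv4 and /128 for IPv6):
import ipaddress

# Sketch of the netban step in ban(): append the mask and let strict=False
# clear the host bits.
NETBAN_IPV4 = '/24'        # illustrative; the default above is /32
address = '203.0.113.57'   # illustrative address

ip = ipaddress.ip_address(address)
mask = NETBAN_IPV4 if isinstance(ip, ipaddress.IPv4Address) else '/128'
net = str(ipaddress.ip_network(address + mask, strict=False))
print(net)  # 203.0.113.0/24 -- one ban counter for the whole /24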

View File

@@ -1,5 +1,5 @@
FROM alpine:3.17
LABEL maintainer "The Infrastructure Company GmbH <info@servercow.de>"
LABEL maintainer "Andre Peters <andre.peters@servercow.de>"
WORKDIR /app

View File

@@ -1,5 +1,5 @@
FROM php:8.2-fpm-alpine3.17
LABEL maintainer "The Infrastructure Company GmbH <info@servercow.de>"
FROM php:8.1-fpm-alpine3.17
LABEL maintainer "Andre Peters <andre.peters@servercow.de>"
# renovate: datasource=github-tags depName=krakjoe/apcu versioning=semver-coerced
ARG APCU_PECL_VERSION=5.1.22
@@ -12,7 +12,7 @@ ARG MEMCACHED_PECL_VERSION=3.2.0
# renovate: datasource=github-tags depName=phpredis/phpredis versioning=semver-coerced
ARG REDIS_PECL_VERSION=5.3.7
# renovate: datasource=github-tags depName=composer/composer versioning=semver-coerced
ARG COMPOSER_VERSION=2.5.5
ARG COMPOSER_VERSION=2.5.4
RUN apk add -U --no-cache autoconf \
aspell-dev \
@@ -52,7 +52,6 @@ RUN apk add -U --no-cache autoconf \
libxpm-dev \
libzip \
libzip-dev \
linux-headers \
make \
mysql-client \
openldap-dev \
@@ -76,7 +75,7 @@ RUN apk add -U --no-cache autoconf \
--with-webp \
--with-xpm \
--with-avif \
&& docker-php-ext-install -j 4 exif gd gettext intl ldap opcache pcntl pdo pdo_mysql pspell soap sockets sysvsem zip bcmath gmp \
&& docker-php-ext-install -j 4 exif gd gettext intl ldap opcache pcntl pdo pdo_mysql pspell soap sockets zip bcmath gmp \
&& docker-php-ext-configure imap --with-imap --with-imap-ssl \
&& docker-php-ext-install -j 4 imap \
&& curl --silent --show-error https://getcomposer.org/installer | php -- --version=${COMPOSER_VERSION} \
@@ -100,7 +99,6 @@ RUN apk add -U --no-cache autoconf \
libxml2-dev \
libxpm-dev \
libzip-dev \
linux-headers \
make \
openldap-dev \
pcre-dev \

View File

@@ -172,24 +172,6 @@ BEGIN
END;
//
DELIMITER ;
DROP EVENT IF EXISTS clean_sasl_log;
DELIMITER //
CREATE EVENT clean_sasl_log
ON SCHEDULE EVERY 1 DAY DO
BEGIN
DELETE sasl_log.* FROM sasl_log
LEFT JOIN (
SELECT username, service, MAX(datetime) AS lastdate
FROM sasl_log
GROUP BY username, service
) AS last ON sasl_log.username = last.username AND sasl_log.service = last.service
WHERE datetime < DATE_SUB(NOW(), INTERVAL 31 DAY) AND datetime < lastdate;
DELETE FROM sasl_log
WHERE username NOT IN (SELECT username FROM mailbox) AND
datetime < DATE_SUB(NOW(), INTERVAL 31 DAY);
END;
//
DELIMITER ;
EOF
fi

View File

@@ -1,5 +1,5 @@
FROM debian:bullseye-slim
LABEL maintainer "The Infrastructure Company GmbH <info@servercow.de>"
LABEL maintainer "Andre Peters <andre.peters@servercow.de>"
ARG DEBIAN_FRONTEND=noninteractive
ENV LC_ALL C

View File

@@ -1,5 +1,5 @@
FROM debian:bullseye-slim
LABEL maintainer "The Infrastructure Company GmbH <info@servercow.de>"
LABEL maintainer "Andre Peters <andre.peters@tinc.gmbh>"
ARG DEBIAN_FRONTEND=noninteractive
ARG CODENAME=bullseye

View File

@@ -1,5 +1,5 @@
FROM debian:bullseye-slim
LABEL maintainer "The Infrastructure Company GmbH <info@servercow.de>"
LABEL maintainer "Andre Peters <andre.peters@servercow.de>"
ARG DEBIAN_FRONTEND=noninteractive
ARG SOGO_DEBIAN_REPOSITORY=http://packages.sogo.nu/nightly/5/debian/

View File

@@ -1,6 +1,6 @@
FROM alpine:3.17
LABEL maintainer "The Infrastructure Company GmbH <info@servercow.de>"
LABEL maintainer "Andre Peters <andre.peters@servercow.de>"
RUN apk add --update --no-cache \
curl \

View File

@@ -24,7 +24,7 @@ server {
add_header X-Download-Options "noopen" always;
add_header X-Frame-Options "SAMEORIGIN" always;
add_header X-Permitted-Cross-Domain-Policies "none" always;
add_header X-Robots-Tag "noindex, nofollow" always;
add_header X-Robots-Tag "none" always;
add_header X-XSS-Protection "1; mode=block" always;
fastcgi_hide_header X-Powered-By;

View File

@@ -0,0 +1,23 @@
plugin {
fts = flatcurve
fts_autoindex = yes
fts_languages = en de
fts_tokenizers = generic email-address
fts_tokenizer_generic = algorithm=simple
# All of these are optional, and indicate the default values.
# They are listed here for documentation purposes; most people should
# not need to define/override in their config.
fts_flatcurve_commit_limit = 500
fts_flatcurve_max_term_size = 30
fts_flatcurve_min_term_size = 2
fts_flatcurve_optimize_limit = 10
fts_flatcurve_rotate_size = 5000
fts_flatcurve_rotate_time = 5000
fts_flatcurve_substring_search = yes
}
service indexer-worker {
vsz_limit = 1024m
}
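With fts_autoindex = yes new mail is indexed as it arrives; pre-existing mailboxes still need one initial indexing run. A hedged sketch of triggering that via the same container-exec pattern the dockerapi handlers use, assuming the stock doveadm index command is available in the dovecot-mailcow container (the mailbox address is illustrative):
import docker  # docker SDK for Python, as used by the dockerapi service

client = docker.from_env()
user = "someone@example.com".replace("'", "'\\''")  # illustrative mailbox
for container in client.containers.list(filters={"name": "dovecot-mailcow"}):
    # -q queues the work for the indexer process; '*' covers all folders.
    result = container.exec_run(
        ["/bin/bash", "-c", "/usr/bin/doveadm index -u '" + user + "' -q '*'"])
    print(result.exit_code, result.output.decode("utf-8"))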

View File

@@ -11,6 +11,7 @@ auth_mechanisms = plain login
#mail_debug = yes
#auth_debug = yes
log_path = syslog
log_debug = category=fts-flatcurve
disable_plaintext_auth = yes
# Uncomment on NFS share
#mmap_disable = yes
@@ -24,11 +25,6 @@ mail_plugins = </etc/dovecot/mail_plugins
mail_attachment_fs = crypt:set_prefix=mail_crypt_global:posix:
mail_attachment_dir = /var/attachments
mail_attachment_min_size = 128k
# Significantly speeds up very large mailboxes, but is only safe to enable if
# you do not manually modify the files in the `cur` directories in
# mailcowdockerized_vmail-vol-1.
# https://docs.mailcow.email/manual-guides/Dovecot/u_e-dovecot-performance/
maildir_very_dirty_syncs = yes
# Dovecot 2.2
#ssl_protocols = !SSLv3
@@ -194,9 +190,6 @@ plugin {
acl_shared_dict = file:/var/vmail/shared-mailboxes.db
acl = vfile
acl_user = %u
fts = solr
fts_autoindex = yes
fts_solr = url=http://solr:8983/solr/dovecot-fts/
quota = dict:Userquota::proxy::sqlquota
quota_rule2 = Trash:storage=+100%%
sieve = /var/vmail/sieve/%u.sieve
@@ -247,6 +240,7 @@ plugin {
mail_log_events = delete undelete expunge copy mailbox_delete mailbox_rename
mail_log_fields = uid box msgid size
mail_log_cached_only = yes
}
service quota-warning {
executable = script /usr/local/bin/quota_notify.py
@@ -302,6 +296,7 @@ replication_dsync_parameters = -d -l 30 -U -n INBOX
!include_try /etc/dovecot/extra.conf
!include_try /etc/dovecot/sogo-sso.conf
!include_try /etc/dovecot/shared_namespace.conf
!include_try /etc/dovecot/dovecot-fts-flatcurve.conf
# </Includes>
default_client_limit = 10400
default_vsz_limit = 1024 M

View File

@@ -27,5 +27,4 @@
#197518 2 #Rackmarkt SL, Spain
#197695 2 #Domain names registrar REG.RU Ltd, Russia
#198068 2 #P.A.G.M. OU, Estonia
#201942 5 #Soltia Consulting SL, Spain
#213373 4 #IP Connect Inc
#201942 5 #Soltia Consulting SL, Spain

View File

@@ -8,7 +8,7 @@ VIRUS_FOUND {
}
# Bad policy from free mail providers
FREEMAIL_POLICY_FAILURE {
expression = "FREEMAIL_FROM & !DMARC_POLICY_ALLOW & !MAILLIST& !WHITELISTED_FWD_HOST & -g+:policies";
expression = "-g+:policies & !DMARC_POLICY_ALLOW & !MAILLIST & ( FREEMAIL_ENVFROM | FREEMAIL_FROM ) & !WHITELISTED_FWD_HOST";
score = 16.0;
}
# Applies to freemail with undisclosed recipients

View File

@@ -159,8 +159,8 @@ BAZAAR_ABUSE_CH {
}
URLHAUS_ABUSE_CH {
type = "selector";
selector = "urls";
type = "url";
filter = "full";
map = "https://urlhaus.abuse.ch/downloads/text_online/";
score = 10.0;
}

View File

@@ -62,7 +62,7 @@
SOGoFirstDayOfWeek = "1";
SOGoSieveFolderEncoding = "UTF-8";
SOGoPasswordChangeEnabled = NO;
SOGoPasswordChangeEnabled = YES;
SOGoSentFolderName = "Sent";
SOGoMailShowSubscribedFoldersOnly = NO;
NGImap4ConnectionStringSeparator = "/";

View File

@@ -3176,10 +3176,8 @@ paths:
example:
attr:
ban_time: "86400"
ban_time_increment: "1"
blacklist: "10.100.6.5/32,10.100.8.4/32"
max_attempts: "5"
max_ban_time: "86400"
netban_ipv4: "24"
netban_ipv6: "64"
retry_window: "600"
@@ -3193,17 +3191,11 @@ paths:
description: the blacklisted ips or hostnames separated by comma
type: string
ban_time:
description: the time an ip should be banned
description: the time a ip should be banned
type: number
ban_time_increment:
description: whether the ban time should increase with each repeated ban
type: boolean
max_attempts:
description: the maximum number of wrong logins before an ip is banned
type: number
max_ban_time:
description: the maximum time an ip should be banned
type: number
netban_ipv4:
description: the network mask to ban for ipv4
type: number
@@ -4121,12 +4113,10 @@ paths:
response:
value:
ban_time: 604800
ban_time_increment: 1
blacklist: |-
45.82.153.37/32
92.118.38.52/32
max_attempts: 1
max_ban_time: 604800
netban_ipv4: 32
netban_ipv6: 128
perm_bans:

View File

@@ -342,10 +342,6 @@ div.dataTables_wrapper div.dt-row {
position: relative;
}
div.dataTables_wrapper span.sorting-value {
display: none;
}
div.dataTables_scrollHead table.dataTable {
margin-bottom: 0 !important;
}

View File

@@ -66,6 +66,4 @@ table tbody tr td input[type="checkbox"] {
padding: .2em .4em .3em !important;
background-color: #ececec!important;
}
.badge.bg-info .bi {
font-size: inherit;
}

View File

@@ -20,11 +20,6 @@ legend {
background-color: #7a7a7a !important;
border-color: #5c5c5c !important;
}
.btn-dark {
color: #000 !important;
background-color: #f6f6f6 !important;
border-color: #ddd !important;
}
.btn-check:checked+.btn-secondary, .btn-check:active+.btn-secondary, .btn-secondary:active, .btn-secondary.active, .show>.btn-secondary.dropdown-toggle {
border-color: #7a7a7a !important;
}
@@ -304,22 +299,22 @@ a:hover {
}
table.dataTable.dtr-inline.collapsed>tbody>tr>td.dtr-control:before:hover,
table.dataTable.dtr-inline.collapsed>tbody>tr>td.dtr-control:before:hover,
table.dataTable.dtr-inline.collapsed>tbody>tr>th.dtr-control:before:hover {
background-color: #7a7a7a !important;
}
table.dataTable.dtr-inline.collapsed>tbody>tr>td.dtr-control:before,
table.dataTable.dtr-inline.collapsed>tbody>tr>td.dtr-control:before,
table.dataTable.dtr-inline.collapsed>tbody>tr>th.dtr-control:before {
background-color: #7a7a7a !important;
border: 1.5px solid #5c5c5c !important;
color: #fff !important;
}
table.dataTable.dtr-inline.collapsed>tbody>tr.parent>td.dtr-control:before,
table.dataTable.dtr-inline.collapsed>tbody>tr.parent>td.dtr-control:before,
table.dataTable.dtr-inline.collapsed>tbody>tr.parent>th.dtr-control:before {
background-color: #949494;
}
table.dataTable.dtr-inline.collapsed>tbody>tr>td.child,
table.dataTable.dtr-inline.collapsed>tbody>tr>th.child,
table.dataTable.dtr-inline.collapsed>tbody>tr>td.child,
table.dataTable.dtr-inline.collapsed>tbody>tr>th.child,
table.dataTable.dtr-inline.collapsed>tbody>tr>td.dataTables_empty {
background-color: #444444;
}
@@ -332,7 +327,7 @@ table.dataTable.dtr-inline.collapsed>tbody>tr>td.dataTables_empty {
}
.btn.btn-outline-secondary {
color: #fff !important;
border-color: #7a7a7a !important;
border-color: #7a7a7a !important;
}
.btn-check:checked+.btn-outline-secondary, .btn-check:active+.btn-outline-secondary, .btn-outline-secondary:active, .btn-outline-secondary.active, .btn-outline-secondary.dropdown-toggle.show {
background-color: #9b9b9b !important;

View File

@@ -49,9 +49,7 @@ function bcc($_action, $_data = null, $_attr = null) {
}
elseif (filter_var($local_dest, FILTER_VALIDATE_EMAIL)) {
$mailbox = mailbox('get', 'mailbox_details', $local_dest);
$shared_aliases = mailbox('get', 'shared_aliases');
$direct_aliases = mailbox('get', 'direct_aliases');
if ($mailbox === false && in_array($local_dest, array_merge($direct_aliases, $shared_aliases)) === false) {
if ($mailbox === false && array_key_exists($local_dest, array_merge($direct_aliases, $shared_aliases)) === false) {
$_SESSION['return'][] = array(
'type' => 'danger',
'log' => array(__FUNCTION__, $_action, $_data, $_attr),

View File

@@ -192,16 +192,5 @@ function docker($action, $service_name = null, $attr1 = null, $attr2 = null, $ex
}
return false;
break;
case 'broadcast':
$request = array(
"api_call" => "container_post",
"container_name" => $service_name,
"post_action" => $attr1,
"request" => $attr2
);
$redis->publish("MC_CHANNEL", json_encode($request));
return true;
break;
}
}

View File

@@ -239,9 +239,7 @@ function fail2ban($_action, $_data = null) {
$is_now = fail2ban('get');
if (!empty($is_now)) {
$ban_time = intval((isset($_data['ban_time'])) ? $_data['ban_time'] : $is_now['ban_time']);
$ban_time_increment = (isset($_data['ban_time_increment']) && $_data['ban_time_increment'] == "1") ? 1 : 0;
$max_attempts = intval((isset($_data['max_attempts'])) ? $_data['max_attempts'] : $is_now['max_attempts']);
$max_ban_time = intval((isset($_data['max_ban_time'])) ? $_data['max_ban_time'] : $is_now['max_ban_time']);
$retry_window = intval((isset($_data['retry_window'])) ? $_data['retry_window'] : $is_now['retry_window']);
$netban_ipv4 = intval((isset($_data['netban_ipv4'])) ? $_data['netban_ipv4'] : $is_now['netban_ipv4']);
$netban_ipv6 = intval((isset($_data['netban_ipv6'])) ? $_data['netban_ipv6'] : $is_now['netban_ipv6']);
@@ -258,8 +256,6 @@ function fail2ban($_action, $_data = null) {
}
$f2b_options = array();
$f2b_options['ban_time'] = ($ban_time < 60) ? 60 : $ban_time;
$f2b_options['ban_time_increment'] = ($ban_time_increment == 1) ? true : false;
$f2b_options['max_ban_time'] = ($max_ban_time < 60) ? 60 : $max_ban_time;
$f2b_options['netban_ipv4'] = ($netban_ipv4 < 8) ? 8 : $netban_ipv4;
$f2b_options['netban_ipv6'] = ($netban_ipv6 < 8) ? 8 : $netban_ipv6;
$f2b_options['netban_ipv4'] = ($netban_ipv4 > 32) ? 32 : $netban_ipv4;

View File

@@ -1015,58 +1015,20 @@ function formatBytes($size, $precision = 2) {
}
return round(pow(1024, $base - floor($base)), $precision) . $suffixes[floor($base)];
}
function update_sogo_static_view($mailbox = null) {
function update_sogo_static_view() {
if (getenv('SKIP_SOGO') == "y") {
return true;
}
global $pdo;
global $lang;
$mailbox_exists = false;
if ($mailbox !== null) {
// Check if the mailbox exists
$stmt = $pdo->prepare("SELECT username FROM mailbox WHERE username = :mailbox AND active = '1'");
$stmt->execute(array(':mailbox' => $mailbox));
$row = $stmt->fetch(PDO::FETCH_ASSOC);
if ($row){
$mailbox_exists = true;
}
$stmt = $pdo->query("SELECT 'OK' FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_NAME = 'sogo_view'");
$num_results = count($stmt->fetchAll(PDO::FETCH_ASSOC));
if ($num_results != 0) {
$stmt = $pdo->query("REPLACE INTO _sogo_static_view (`c_uid`, `domain`, `c_name`, `c_password`, `c_cn`, `mail`, `aliases`, `ad_aliases`, `ext_acl`, `kind`, `multiple_bookings`)
SELECT `c_uid`, `domain`, `c_name`, `c_password`, `c_cn`, `mail`, `aliases`, `ad_aliases`, `ext_acl`, `kind`, `multiple_bookings` from sogo_view");
$stmt = $pdo->query("DELETE FROM _sogo_static_view WHERE `c_uid` NOT IN (SELECT `username` FROM `mailbox` WHERE `active` = '1');");
}
$query = "REPLACE INTO _sogo_static_view (`c_uid`, `domain`, `c_name`, `c_password`, `c_cn`, `mail`, `aliases`, `ad_aliases`, `ext_acl`, `kind`, `multiple_bookings`)
SELECT
mailbox.username,
mailbox.domain,
mailbox.username,
IF(JSON_UNQUOTE(JSON_VALUE(attributes, '$.force_pw_update')) = '0',
IF(JSON_UNQUOTE(JSON_VALUE(attributes, '$.sogo_access')) = 1, password, '{SSHA256}A123A123A321A321A321B321B321B123B123B321B432F123E321123123321321'),
'{SSHA256}A123A123A321A321A321B321B321B123B123B321B432F123E321123123321321'),
mailbox.name,
mailbox.username,
IFNULL(GROUP_CONCAT(ga.aliases ORDER BY ga.aliases SEPARATOR ' '), ''),
IFNULL(gda.ad_alias, ''),
IFNULL(external_acl.send_as_acl, ''),
mailbox.kind,
mailbox.multiple_bookings
FROM
mailbox
LEFT OUTER JOIN grouped_mail_aliases ga ON ga.username REGEXP CONCAT('(^|,)', mailbox.username, '($|,)')
LEFT OUTER JOIN grouped_domain_alias_address gda ON gda.username = mailbox.username
LEFT OUTER JOIN grouped_sender_acl_external external_acl ON external_acl.username = mailbox.username
WHERE
mailbox.active = '1'";
if ($mailbox_exists) {
$query .= " AND mailbox.username = :mailbox";
$stmt = $pdo->prepare($query);
$stmt->execute(array(':mailbox' => $mailbox));
} else {
$query .= " GROUP BY mailbox.username";
$stmt = $pdo->query($query);
}
$stmt = $pdo->query("DELETE FROM _sogo_static_view WHERE `c_uid` NOT IN (SELECT `username` FROM `mailbox` WHERE `active` = '1');");
flush_memcached();
}
function edit_user_account($_data) {

View File

@@ -1264,13 +1264,11 @@ function mailbox($_action, $_type, $_data = null, $_extra = null) {
));
}
update_sogo_static_view($username);
$_SESSION['return'][] = array(
'type' => 'success',
'log' => array(__FUNCTION__, $_action, $_type, $_data_log, $_attr),
'msg' => array('mailbox_added', htmlspecialchars($username))
);
return true;
break;
case 'resource':
$domain = idn_to_ascii(strtolower(trim($_data['domain'])), 0, INTL_IDNA_VARIANT_UTS46);
@@ -3132,10 +3130,7 @@ function mailbox($_action, $_type, $_data = null, $_extra = null) {
'log' => array(__FUNCTION__, $_action, $_type, $_data_log, $_attr),
'msg' => array('mailbox_modified', $username)
);
update_sogo_static_view($username);
}
return true;
break;
case 'mailbox_templates':
if ($_SESSION['mailcow_cc_role'] != "admin") {
@@ -3965,39 +3960,6 @@ function mailbox($_action, $_type, $_data = null, $_extra = null) {
}
return $aliasdomaindata;
break;
case 'shared_aliases':
$shared_aliases = array();
$stmt = $pdo->query("SELECT `address` FROM `alias`
WHERE `goto` REGEXP ','
AND `address` NOT LIKE '@%'
AND `goto` != `address`");
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
while($row = array_shift($rows)) {
$domain = explode("@", $row['address'])[1];
if (hasDomainAccess($_SESSION['mailcow_cc_username'], $_SESSION['mailcow_cc_role'], $domain)) {
$shared_aliases[] = $row['address'];
}
}
return $shared_aliases;
break;
case 'direct_aliases':
$direct_aliases = array();
$stmt = $pdo->query("SELECT `address` FROM `alias`
WHERE `goto` NOT LIKE '%,%'
AND `address` NOT LIKE '@%'
AND `goto` != `address`");
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
while($row = array_shift($rows)) {
$domain = explode("@", $row['address'])[1];
if (hasDomainAccess($_SESSION['mailcow_cc_username'], $_SESSION['mailcow_cc_role'], $domain)) {
$direct_aliases[] = $row['address'];
}
}
return $direct_aliases;
break;
case 'domains':
$domains = array();
if ($_SESSION['mailcow_cc_role'] != "admin" && $_SESSION['mailcow_cc_role'] != "domainadmin") {
@@ -4930,7 +4892,14 @@ function mailbox($_action, $_type, $_data = null, $_extra = null) {
if (!empty($mailbox_details['domain']) && !empty($mailbox_details['local_part'])) {
$maildir = $mailbox_details['domain'] . '/' . $mailbox_details['local_part'];
$exec_fields = array('cmd' => 'maildir', 'task' => 'cleanup', 'maildir' => $maildir);
docker('broadcast', 'dovecot-mailcow', 'exec', $exec_fields);
$maildir_gc = json_decode(docker('post', 'dovecot-mailcow', 'exec', $exec_fields), true);
if ($maildir_gc['type'] != 'success') {
$_SESSION['return'][] = array(
'type' => 'warning',
'log' => array(__FUNCTION__, $_action, $_type, $_data_log, $_attr),
'msg' => 'Could not move maildir to garbage collector: ' . $maildir_gc['msg']
);
}
}
else {
$_SESSION['return'][] = array(
@@ -4982,10 +4951,9 @@ function mailbox($_action, $_type, $_data = null, $_extra = null) {
$stmt->execute(array(
':username' => $username
));
$stmt = $pdo->prepare("DELETE FROM `sender_acl` WHERE `logged_in_as` = :logged_in_as OR `send_as` = :send_as");
$stmt = $pdo->prepare("DELETE FROM `sender_acl` WHERE `logged_in_as` = :username");
$stmt->execute(array(
':logged_in_as' => $username,
':send_as' => $username
':username' => $username
));
// fk, better safe than sorry
$stmt = $pdo->prepare("DELETE FROM `user_acl` WHERE `username` = :username");
@@ -5085,15 +5053,12 @@ function mailbox($_action, $_type, $_data = null, $_extra = null) {
);
continue;
}
update_sogo_static_view($username);
$_SESSION['return'][] = array(
'type' => 'success',
'log' => array(__FUNCTION__, $_action, $_type, $_data_log, $_attr),
'msg' => array('mailbox_removed', htmlspecialchars($username))
);
}
return true;
break;
case 'mailbox_templates':
if ($_SESSION['mailcow_cc_role'] != "admin") {
@@ -5299,68 +5264,7 @@ function mailbox($_action, $_type, $_data = null, $_extra = null) {
}
break;
}
if ($_action != 'get' && in_array($_type, array('domain', 'alias', 'alias_domain', 'resource')) && getenv('SKIP_SOGO') != "y") {
if ($_action != 'get' && in_array($_type, array('domain', 'alias', 'alias_domain', 'mailbox', 'resource')) && getenv('SKIP_SOGO') != "y") {
update_sogo_static_view();
}
}
function mailbox_sso($_action, $_data) {
global $pdo;
switch ($_action) {
case 'check':
$token = $_data;
$stmt = $pdo->prepare("SELECT `t1`.`username` FROM `mailbox_sso` AS `t1` JOIN `mailbox` AS `t2` ON `t1`.`username` = `t2`.`username` WHERE `t1`.`token` = :token AND `t1`.`created` > DATE_SUB(NOW(), INTERVAL '30' SECOND) AND `t2`.`active` = 1;");
$stmt->execute(array(
':token' => preg_replace('/[^a-zA-Z0-9-]/', '', $token)
));
$return = $stmt->fetch(PDO::FETCH_ASSOC);
return empty($return['username']) ? false : $return['username'];
case 'issue':
if ($_SESSION['mailcow_cc_role'] != "admin") {
$_SESSION['return'][] = array(
'type' => 'danger',
'log' => array(__FUNCTION__, $_action, $_data),
'msg' => 'access_denied'
);
return false;
}
$username = $_data['username'];
$stmt = $pdo->prepare("SELECT `username` FROM `mailbox`
WHERE `username` = :username");
$stmt->execute(array(':username' => $username));
$num_results = count($stmt->fetchAll(PDO::FETCH_ASSOC));
if ($num_results < 1) {
$_SESSION['return'][] = array(
'type' => 'danger',
'log' => array(__FUNCTION__, $_action, $_data),
'msg' => array('object_doesnt_exist', htmlspecialchars($username))
);
return false;
}
$token = implode('-', array(
strtoupper(bin2hex(random_bytes(3))),
strtoupper(bin2hex(random_bytes(3))),
strtoupper(bin2hex(random_bytes(3))),
strtoupper(bin2hex(random_bytes(3))),
strtoupper(bin2hex(random_bytes(3)))
));
$stmt = $pdo->prepare("INSERT INTO `mailbox_sso` (`username`, `token`)
VALUES (:username, :token)");
$stmt->execute(array(
':username' => $username,
':token' => $token
));
// perform cleanup
$pdo->query("DELETE FROM `mailbox_sso` WHERE created < DATE_SUB(NOW(), INTERVAL '30' SECOND);");
return ['token' => $token];
break;
}
}

View File

@@ -3,7 +3,7 @@ function init_db_schema() {
try {
global $pdo;
$db_version = "15062023_2057";
$db_version = "14022023_1000";
$stmt = $pdo->query("SHOW TABLES LIKE 'versions'");
$num_results = count($stmt->fetchAll(PDO::FETCH_ASSOC));
@@ -361,19 +361,6 @@ function init_db_schema() {
),
"attr" => "ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 ROW_FORMAT=DYNAMIC"
),
"mailbox_sso" => array(
"cols" => array(
"username" => "VARCHAR(255) NOT NULL",
"token" => "VARCHAR(255) NOT NULL",
"created" => "DATETIME(0) NOT NULL DEFAULT NOW(0)",
),
"keys" => array(
"primary" => array(
"" => array("token", "created")
),
),
"attr" => "ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 ROW_FORMAT=DYNAMIC"
),
"tags_mailbox" => array(
"cols" => array(
"tag_name" => "VARCHAR(255) NOT NULL",

View File

@@ -1,20 +1,13 @@
<?php
// SSO Domain Admin
if (!empty($_GET['sso_token'])) {
// SSO Domain Admin
$username = domain_admin_sso('check', $_GET['sso_token']);
if ($username !== false) {
$_SESSION['mailcow_cc_username'] = $username;
$_SESSION['mailcow_cc_role'] = 'domainadmin';
header('Location: /mailbox');
}
// SSO Mailbox User
$username = mailbox_sso('check', $_GET['sso_token']);
if ($username !== false) {
$_SESSION['mailcow_cc_username'] = $username;
$_SESSION['mailcow_cc_role'] = 'user';
header('Location: /mailbox');
}
}
if (isset($_POST["verify_tfa_login"])) {
@@ -70,7 +63,7 @@ if (isset($_POST["login_user"]) && isset($_POST["pass_user"])) {
unset($_SESSION['index_query_string']);
if (in_array('mobileconfig', $http_parameters)) {
if (in_array('only_email', $http_parameters)) {
header("Location: /mobileconfig.php?only_email");
header("Location: /mobileconfig.php?email_only");
die();
}
header("Location: /mobileconfig.php");

View File

@@ -1,13 +1,3 @@
const LOCALE = undefined;
const DATETIME_FORMAT = {
year: "numeric",
month: "2-digit",
day: "2-digit",
hour: "2-digit",
minute: "2-digit",
second: "2-digit"
};
$(document).ready(function() {
// mailcow alert box generator
window.mailcow_alert_box = function(message, type) {

View File

@@ -117,8 +117,8 @@ jQuery(function($){
data: 'tfa_active',
defaultContent: '',
render: function (data, type) {
if(data == 1) return '<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>';
else return '<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
if(data == 1) return '<i class="bi bi-check-lg"></i>';
else return '<i class="bi bi-x-lg"></i>';
}
},
{
@@ -126,8 +126,8 @@ jQuery(function($){
data: 'active',
defaultContent: '',
render: function (data, type) {
if(data == 1) return '<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>';
else return '<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
if(data == 1) return '<i class="bi bi-check-lg"></i>';
else return '<i class="bi bi-x-lg"></i>';
}
},
{
@@ -260,8 +260,8 @@ jQuery(function($){
data: 'tfa_active',
defaultContent: '',
render: function (data, type) {
if(data == 1) return '<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>';
else return '<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
if(data == 1) return '<i class="bi bi-check-lg"></i>';
else return '<i class="bi bi-x-lg"></i>';
}
},
{
@@ -269,8 +269,8 @@ jQuery(function($){
data: 'active',
defaultContent: '',
render: function (data, type) {
if(data == 1) return '<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>';
else return '<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
if(data == 1) return '<i class="bi bi-check-lg"></i>';
else return '<i class="bi bi-x-lg"></i>';
}
},
{
@@ -337,7 +337,7 @@ jQuery(function($){
data: 'keep_spam',
defaultContent: '',
render: function(data, type){
return 'yes'==data?'<i class="bi bi-x-lg"><span class="sorting-value">yes</span></i>':'no'==data&&'<i class="bi bi-check-lg"><span class="sorting-value">no</span></i>';
return 'yes'==data?'<i class="bi bi-x-lg"></i>':'no'==data&&'<i class="bi bi-check-lg"></i>';
}
},
{
@@ -414,8 +414,8 @@ jQuery(function($){
data: 'active',
defaultContent: '',
render: function (data, type) {
if(data == 1) return '<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>';
else return '<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
if(data == 1) return '<i class="bi bi-check-lg"></i>';
else return '<i class="bi bi-x-lg"></i>';
}
},
{
@@ -492,8 +492,8 @@ jQuery(function($){
data: 'active',
defaultContent: '',
render: function (data, type) {
if(data == 1) return '<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>';
else return '<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
if(data == 1) return '<i class="bi bi-check-lg"></i>';
else return '<i class="bi bi-x-lg"></i>';
}
},
{

View File

@@ -1,3 +1,13 @@
const LOCALE = undefined;
const DATETIME_FORMAT = {
year: "numeric",
month: "2-digit",
day: "2-digit",
hour: "2-digit",
minute: "2-digit",
second: "2-digit"
};
$(document).ready(function() {
// Parse seconds ago to date
// Get "now" timestamp
@@ -33,7 +43,7 @@ $(document).ready(function() {
if (mailcow_info.branch === "master"){
check_update(mailcow_info.version_tag, mailcow_info.project_url);
}
$("#mailcow_version").click(function(){
$("#maiclow_version").click(function(){
if (mailcow_cc_role !== "admin" && mailcow_cc_role !== "domainadmin" || mailcow_info.branch !== "master")
return;
@@ -819,10 +829,13 @@ jQuery(function($){
url: '/api/v1/get/rspamd/actions',
async: true,
success: function(data){
console.log(data);
var total = 0;
$(data).map(function(){total += this[1];});
var labels = $.makeArray($(data).map(function(){return this[0] + ' ' + Math.round(this[1]/total * 100) + '%';}));
var values = $.makeArray($(data).map(function(){return this[1];}));
console.log(values);
var graphdata = {
labels: labels,
@@ -938,15 +951,12 @@ jQuery(function($){
title: 'Score',
data: 'score',
defaultContent: '',
class: 'text-nowrap',
createdCell: function(td, cellData) {
$(td).attr({
"data-order": cellData.sortBy,
"data-sort": cellData.sortBy
});
},
render: function (data) {
return data.value;
$(td).html(cellData.value);
}
},
{
@@ -969,9 +979,7 @@ jQuery(function($){
"data-order": cellData.sortBy,
"data-sort": cellData.sortBy
});
},
render: function (data) {
return data.value;
$(td).html(cellData.value);
}
},
{
@@ -1173,7 +1181,7 @@ jQuery(function($){
if (table = $('#' + log_table).DataTable()) {
var heading = $('#' + log_table).closest('.card').find('.card-header');
var load_rows = (table.data().length + 1) + '-' + (table.data().length + new_nrows)
var load_rows = (table.page.len() + 1) + '-' + (table.page.len() + new_nrows)
$.get('/api/v1/get/logs/' + log_url + '/' + load_rows).then(function(data){
if (data.length === undefined) { mailcow_alert_box(lang.no_new_rows, "info"); return; }
@@ -1294,12 +1302,6 @@ function update_stats(timeout=5){
$("#host_cpu_usage").text(parseInt(data.cpu.usage).toString() + "%");
$("#host_memory_total").text((data.memory.total / (1024 ** 3)).toFixed(2).toString() + "GB");
$("#host_memory_usage").text(parseInt(data.memory.usage).toString() + "%");
if (data.architecture == "aarch64"){
$("#host_architecture").html('<span data-bs-toggle="tooltip" data-bs-placement="top" title="' + lang_debug.wip +'">' + data.architecture + ' ⚠️</span>');
}
else {
$("#host_architecture").html(data.architecture);
}
// update cpu and mem chart
var cpu_chart = Chart.getChart("host_cpu_chart");

View File

@@ -607,7 +607,7 @@ jQuery(function($){
defaultContent: '',
responsivePriority: 6,
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':(0==data?'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>':2==data&&'&#8212;');
return 1==data?'<i class="bi bi-check-lg"></i>':(0==data?'<i class="bi bi-x-lg"></i>':2==data&&'&#8212;');
}
},
{
@@ -754,7 +754,7 @@ jQuery(function($){
data: 'attributes.gal',
defaultContent: '',
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
return 1==data?'<i class="bi bi-check-lg"></i>':'<i class="bi bi-x-lg"></i>';
}
},
{
@@ -762,7 +762,7 @@ jQuery(function($){
data: 'attributes.backupmx',
defaultContent: '',
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
return 1==data?'<i class="bi bi-check-lg"></i>':'<i class="bi bi-x-lg"></i>';
}
},
{
@@ -770,7 +770,7 @@ jQuery(function($){
data: 'attributes.relay_all_recipients',
defaultContent: '',
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
return 1==data?'<i class="bi bi-check-lg"></i>':'<i class="bi bi-x-lg"></i>';
}
},
{
@@ -778,7 +778,7 @@ jQuery(function($){
data: 'attributes.relay_unknown_only',
defaultContent: '',
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
return 1==data?'<i class="bi bi-check-lg"></i>':'<i class="bi bi-x-lg"></i>';
}
},
{
@@ -787,7 +787,7 @@ jQuery(function($){
defaultContent: '',
responsivePriority: 4,
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
return 1==data?'<i class="bi bi-check-lg"></i>':'<i class="bi bi-x-lg"></i>';
}
},
{
@@ -1093,7 +1093,7 @@ jQuery(function($){
defaultContent: '',
responsivePriority: 4,
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':(0==data?'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>':2==data&&'&#8212;');
return 1==data?'<i class="bi bi-check-lg"></i>':(0==data?'<i class="bi bi-x-lg"></i>':2==data&&'&#8212;');
}
},
{
@@ -1164,13 +1164,13 @@ jQuery(function($){
item.attributes.quota = humanFileSize(item.attributes.quota);
item.attributes.tls_enforce_in = '<i class="text-' + (item.attributes.tls_enforce_in == 1 ? 'success bi bi-lock-fill' : 'danger bi bi-unlock-fill') + '"><span class="sorting-value">' + (item.attributes.tls_enforce_in == 1 ? '1' : '0') + '</span></i>';
item.attributes.tls_enforce_out = '<i class="text-' + (item.attributes.tls_enforce_out == 1 ? 'success bi bi-lock-fill' : 'danger bi bi-unlock-fill') + '"><span class="sorting-value">' + (item.attributes.tls_enforce_out == 1 ? '1' : '0') + '</span></i>';
item.attributes.pop3_access = '<i class="text-' + (item.attributes.pop3_access == 1 ? 'success' : 'danger') + ' bi bi-' + (item.attributes.pop3_access == 1 ? 'check-lg' : 'x-lg') + '"><span class="sorting-value">' + (item.attributes.pop3_access == 1 ? '1' : '0') + '</span></i>';
item.attributes.imap_access = '<i class="text-' + (item.attributes.imap_access == 1 ? 'success' : 'danger') + ' bi bi-' + (item.attributes.imap_access == 1 ? 'check-lg' : 'x-lg') + '"><span class="sorting-value">' + (item.attributes.imap_access == 1 ? '1' : '0') + '</span></i>';
item.attributes.smtp_access = '<i class="text-' + (item.attributes.smtp_access == 1 ? 'success' : 'danger') + ' bi bi-' + (item.attributes.smtp_access == 1 ? 'check-lg' : 'x-lg') + '"><span class="sorting-value">' + (item.attributes.smtp_access == 1 ? '1' : '0') + '</span></i>';
item.attributes.sieve_access = '<i class="text-' + (item.attributes.sieve_access == 1 ? 'success' : 'danger') + ' bi bi-' + (item.attributes.sieve_access == 1 ? 'check-lg' : 'x-lg') + '"><span class="sorting-value">' + (item.attributes.sieve_access == 1 ? '1' : '0') + '</span></i>';
item.attributes.sogo_access = '<i class="text-' + (item.attributes.sogo_access == 1 ? 'success' : 'danger') + ' bi bi-' + (item.attributes.sogo_access == 1 ? 'check-lg' : 'x-lg') + '"><span class="sorting-value">' + (item.attributes.sogo_access == 1 ? '1' : '0') + '</span></i>';
item.attributes.tls_enforce_in = '<i class="text-' + (item.attributes.tls_enforce_in == 1 ? 'success bi bi-lock-fill' : 'danger bi bi-unlock-fill') + '"></i>';
item.attributes.tls_enforce_out = '<i class="text-' + (item.attributes.tls_enforce_out == 1 ? 'success bi bi-lock-fill' : 'danger bi bi-unlock-fill') + '"></i>';
item.attributes.pop3_access = '<i class="text-' + (item.attributes.pop3_access == 1 ? 'success' : 'danger') + ' bi bi-' + (item.attributes.pop3_access == 1 ? 'check-lg' : 'x-lg') + '"></i>';
item.attributes.imap_access = '<i class="text-' + (item.attributes.imap_access == 1 ? 'success' : 'danger') + ' bi bi-' + (item.attributes.imap_access == 1 ? 'check-lg' : 'x-lg') + '"></i>';
item.attributes.smtp_access = '<i class="text-' + (item.attributes.smtp_access == 1 ? 'success' : 'danger') + ' bi bi-' + (item.attributes.smtp_access == 1 ? 'check-lg' : 'x-lg') + '"></i>';
item.attributes.sieve_access = '<i class="text-' + (item.attributes.sieve_access == 1 ? 'success' : 'danger') + ' bi bi-' + (item.attributes.sieve_access == 1 ? 'check-lg' : 'x-lg') + '"></i>';
item.attributes.sogo_access = '<i class="text-' + (item.attributes.sogo_access == 1 ? 'success' : 'danger') + ' bi bi-' + (item.attributes.sogo_access == 1 ? 'check-lg' : 'x-lg') + '"></i>';
if (item.attributes.quarantine_notification === 'never') {
item.attributes.quarantine_notification = lang.never;
} else if (item.attributes.quarantine_notification === 'hourly') {
@@ -1188,6 +1188,7 @@ jQuery(function($){
item.attributes.quarantine_category = lang.q_all;
}
if (item.template.toLowerCase() == "default"){
item.action = '<div class="btn-group">' +
'<a href="/edit/template/' + encodeURIComponent(item.id) + '" class="btn btn-xs btn-xs-half btn-secondary"><i class="bi bi-pencil-fill"></i> ' + lang.edit + '</a>' +
@@ -1328,7 +1329,7 @@ jQuery(function($){
defaultContent: '',
responsivePriority: 4,
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':(0==data?'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>':2==data&&'&#8212;');
return 1==data?'<i class="bi bi-check-lg"></i>':(0==data?'<i class="bi bi-x-lg"></i>':2==data&&'&#8212;');
}
},
{
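Editor's note: the paired render lines in the hunks above differ only in whether the check/cross icon carries a hidden <span class="sorting-value">; that span gives DataTables a stable 0/1 value to order the column by while the user only sees the icon. A minimal sketch of that pattern, using a hypothetical renderBoolIcon() helper that is not part of this changeset:

    // Illustration only: render a 0/1/2 flag as a Bootstrap icon plus a hidden
    // numeric value that DataTables can use when ordering the column.
    function renderBoolIcon(data) {
      if (data == 2) return '&#8212;';                      // "not applicable"
      var icon  = (data == 1) ? 'bi-check-lg' : 'bi-x-lg';  // check or cross
      var value = (data == 1) ? '1' : '0';                  // value used for sorting
      return '<i class="bi ' + icon + '"><span class="sorting-value">' + value + '</span></i>';
    }

    // Hypothetical usage inside a column definition:
    // { data: 'active', render: function (data, type) { return renderBoolIcon(data); } }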
@@ -1439,7 +1440,7 @@ jQuery(function($){
data: 'active',
defaultContent: '',
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':(0==data?'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>':2==data&&'&#8212;');
return 1==data?'<i class="bi bi-check-lg"></i>':(0==data?'<i class="bi bi-x-lg"></i>':2==data&&'&#8212;');
}
},
{
@@ -1458,37 +1459,30 @@ jQuery(function($){
}
function draw_bcc_table() {
$.get("/api/v1/get/bcc-destination-options", function(data){
var optgroup = "";
// Domains
if (data.domains && data.domains.length > 0) {
optgroup = "<optgroup label='" + lang.domains + "'>";
$.each(data.domains, function(index, domain){
optgroup += "<option value='" + domain + "'>" + domain + "</option>";
var optgroup = "<optgroup label='" + lang.domains + "'>";
$.each(data.domains, function(index, domain){
optgroup += "<option value='" + domain + "'>" + domain + "</option>";
});
optgroup += "</optgroup>";
$('#bcc-local-dest').append(optgroup);
// Alias domains
var optgroup = "<optgroup label='" + lang.domain_aliases + "'>";
$.each(data.alias_domains, function(index, alias_domain){
optgroup += "<option value='" + alias_domain + "'>" + alias_domain + "</option>";
});
optgroup += "</optgroup>"
$('#bcc-local-dest').append(optgroup);
// Mailboxes and aliases
$.each(data.mailboxes, function(mailbox, aliases){
var optgroup = "<optgroup label='" + mailbox + "'>";
$.each(aliases, function(index, alias){
optgroup += "<option value='" + alias + "'>" + alias + "</option>";
});
optgroup += "</optgroup>";
$('#bcc-local-dest').append(optgroup);
}
// Alias domains
if (data.alias_domains && data.alias_domains.length > 0) {
optgroup = "<optgroup label='" + lang.domain_aliases + "'>";
$.each(data.alias_domains, function(index, alias_domain){
optgroup += "<option value='" + alias_domain + "'>" + alias_domain + "</option>";
});
optgroup += "</optgroup>"
$('#bcc-local-dest').append(optgroup);
}
// Mailboxes and aliases
if (data.mailboxes && Object.keys(data.mailboxes).length > 0) {
$.each(data.mailboxes, function(mailbox, aliases){
optgroup = "<optgroup label='" + mailbox + "'>";
$.each(aliases, function(index, alias){
optgroup += "<option value='" + alias + "'>" + alias + "</option>";
});
optgroup += "</optgroup>";
$('#bcc-local-dest').append(optgroup);
});
}
// Recreate picker
});
// Finish
$('#bcc-local-dest').selectpicker('refresh');
});
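Editor's note: the interleaved lines above show two shapes of draw_bcc_table(); one appends each <optgroup> to #bcc-local-dest unconditionally, the other wraps every group in an existence check and refreshes the selectpicker once at the end. A rough sketch of the guarded variant, assuming the same response layout from /api/v1/get/bcc-destination-options (domains, alias_domains, mailboxes) and using a hypothetical appendOptgroup() helper:

    // Sketch only: build one <optgroup> per list and append it when the list is non-empty.
    function appendOptgroup(label, values) {
      var optgroup = "<optgroup label='" + label + "'>";
      $.each(values, function (index, value) {
        optgroup += "<option value='" + value + "'>" + value + "</option>";
      });
      optgroup += "</optgroup>";
      $('#bcc-local-dest').append(optgroup);
    }

    $.get("/api/v1/get/bcc-destination-options", function (data) {
      if (data.domains && data.domains.length > 0) appendOptgroup(lang.domains, data.domains);
      if (data.alias_domains && data.alias_domains.length > 0) appendOptgroup(lang.domain_aliases, data.alias_domains);
      if (data.mailboxes && Object.keys(data.mailboxes).length > 0) {
        $.each(data.mailboxes, function (mailbox, aliases) { appendOptgroup(mailbox, aliases); });
      }
      $('#bcc-local-dest').selectpicker('refresh');  // refresh the picker once, after all groups are added
    });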
@@ -1584,7 +1578,7 @@ jQuery(function($){
data: 'active',
defaultContent: '',
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':(0==data?'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>':2==data&&'&#8212;');
return 1==data?'<i class="bi bi-check-lg"></i>':(0==data?'<i class="bi bi-x-lg"></i>':2==data&&'&#8212;');
}
},
{
@@ -1681,7 +1675,7 @@ jQuery(function($){
data: 'active',
defaultContent: '',
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':0==data&&'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
return 1==data?'<i class="bi bi-check-lg"></i>':0==data&&'<i class="bi bi-x-lg"></i>';
}
},
{
@@ -1788,7 +1782,7 @@ jQuery(function($){
data: 'active',
defaultContent: '',
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':0==data&&'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
return 1==data?'<i class="bi bi-check-lg"></i>':0==data&&'<i class="bi bi-x-lg"></i>';
}
},
{
@@ -1923,7 +1917,7 @@ jQuery(function($){
data: 'sogo_visible',
defaultContent: '',
render: function(data, type){
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':0==data&&'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
return 1==data?'<i class="bi bi-check-lg"></i>':0==data&&'<i class="bi bi-x-lg"></i>';
}
},
{
@@ -1942,7 +1936,7 @@ jQuery(function($){
defaultContent: '',
responsivePriority: 6,
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':0==data&&'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
return 1==data?'<i class="bi bi-check-lg"></i>':0==data&&'<i class="bi bi-x-lg"></i>';
}
},
{
@@ -1958,10 +1952,6 @@ jQuery(function($){
table.on('responsive-resize', function (e, datatable, columns){
hideTableExpandCollapseBtn('#tab-mbox-aliases', '#alias_table');
});
table.on( 'draw', function (){
$('#alias_table [data-bs-toggle="tooltip"]').tooltip();
});
}
function draw_aliasdomain_table() {
// just recalc width if instance already exists
@@ -2041,7 +2031,7 @@ jQuery(function($){
data: 'active',
defaultContent: '',
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':0==data&&'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
return 1==data?'<i class="bi bi-check-lg"></i>':0==data&&'<i class="bi bi-x-lg"></i>';
}
},
{
@@ -2177,7 +2167,7 @@ jQuery(function($){
data: 'active',
defaultContent: '',
render: function (data, type) {
return 1==data?'<i class="bi bi-check-lg"><span class="sorting-value">1</span></i>':0==data&&'<i class="bi bi-x-lg"><span class="sorting-value">0</span></i>';
return 1==data?'<i class="bi bi-check-lg"></i>':0==data&&'<i class="bi bi-x-lg"></i>';
}
},
{
@@ -2333,19 +2323,16 @@ jQuery(function($){
// detect element visibility changes
function onVisible(element, callback) {
$(document).ready(function() {
let element_object = document.querySelector(element);
element_object = document.querySelector(element);
if (element_object === null) return;
let observer = new IntersectionObserver((entries, observer) => {
new IntersectionObserver((entries, observer) => {
entries.forEach(entry => {
if(entry.intersectionRatio > 0) {
callback(element_object);
observer.unobserve(element_object);
}
});
})
observer.observe(element_object);
}).observe(element_object);
});
}
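Editor's note: the interleaved lines above show two equivalent shapes of the onVisible() helper, one keeping the IntersectionObserver in a local variable and one chaining .observe() directly onto the constructor; either way the callback fires once when the element first scrolls into view and the observer then stops watching it. A condensed sketch of that behaviour:

    // Sketch: run `callback` once when `element` (a CSS selector) first becomes visible.
    function onVisible(element, callback) {
      $(document).ready(function () {
        var element_object = document.querySelector(element);
        if (element_object === null) return;
        new IntersectionObserver(function (entries, observer) {
          entries.forEach(function (entry) {
            if (entry.intersectionRatio > 0) {
              callback(element_object);
              observer.unobserve(element_object);  // stop watching after the first hit
            }
          });
        }).observe(element_object);
      });
    }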

View File

@@ -127,20 +127,6 @@ jQuery(function($){
}
}
function createSortableDate(td, cellData, date_string = false) {
if (date_string)
var date = new Date(cellData);
else
var date = new Date(cellData ? cellData * 1000 : 0);
var timestamp = date.getTime();
$(td).attr({
"data-order": timestamp,
"data-sort": timestamp
});
$(td).html(date.toLocaleDateString(LOCALE, DATETIME_FORMAT));
}
function draw_tla_table() {
// just recalc width if instance already exists
if ($.fn.DataTable.isDataTable('#tla_table') ) {
@@ -158,7 +144,6 @@ jQuery(function($){
"tr" +
"<'row'<'col-sm-12 col-md-5'i><'col-sm-12 col-md-7'p>>",
language: lang_datatables,
order: [[4, 'desc']],
ajax: {
type: "GET",
url: "/api/v1/get/time_limited_aliases",
@@ -206,16 +191,18 @@ jQuery(function($){
title: lang.alias_valid_until,
data: 'validity',
defaultContent: '',
createdCell: function(td, cellData) {
createSortableDate(td, cellData)
render: function (data, type) {
var date = new Date(data ? data * 1000 : 0);
return date.toLocaleDateString(undefined, {year: "numeric", month: "2-digit", day: "2-digit", hour: "2-digit", minute: "2-digit", second: "2-digit"});
}
},
{
title: lang.created_on,
data: 'created',
defaultContent: '',
createdCell: function(td, cellData) {
createSortableDate(td, cellData, true)
render: function (data, type) {
var date = new Date(data.replace(/-/g, "/"));
return date.toLocaleDateString(undefined, {year: "numeric", month: "2-digit", day: "2-digit", hour: "2-digit", minute: "2-digit", second: "2-digit"});
}
},
{
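Editor's note: the two date columns above are handled differently on each side of this hunk. One variant delegates to createSortableDate(), which stamps a numeric timestamp into the cell's data-order/data-sort attributes so the column sorts chronologically while showing a localized string; the other formats the value inline in the render callback with toLocaleDateString(). A small sketch of the inline formatting, assuming (as the surrounding code suggests) that `validity` is a Unix timestamp in seconds and `created` is a dash-separated "YYYY-MM-DD HH:MM:SS" string; the helper names are illustrative only:

    // Sketch: format the two kinds of date values used by the time-limited alias table.
    var dateFormat = { year: "numeric", month: "2-digit", day: "2-digit",
                       hour: "2-digit", minute: "2-digit", second: "2-digit" };

    // `validity` arrives as a Unix timestamp in seconds.
    function formatValidity(validity) {
      var date = new Date(validity ? validity * 1000 : 0);
      return date.toLocaleDateString(undefined, dateFormat);
    }

    // `created` arrives as "YYYY-MM-DD HH:MM:SS"; dashes are swapped for slashes
    // so Date() parses it consistently across browsers.
    function formatCreated(created) {
      var date = new Date(created.replace(/-/g, "/"));
      return date.toLocaleDateString(undefined, dateFormat);
    }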

View File

@@ -288,26 +288,18 @@ if (isset($_GET['query'])) {
case "domain-admin":
process_add_return(domain_admin('add', $attr));
break;
case "sso":
switch ($object) {
case "domain-admin":
$data = domain_admin_sso('issue', $attr);
if($data) {
echo json_encode($data);
exit(0);
}
process_add_return($data);
break;
case "mailbox":
$data = mailbox_sso('issue', $attr);
if($data) {
echo json_encode($data);
exit(0);
}
process_add_return($data);
break;
}
break;
case "sso":
switch ($object) {
case "domain-admin":
$data = domain_admin_sso('issue', $attr);
if($data) {
echo json_encode($data);
exit(0);
}
process_add_return($data);
break;
}
break;
case "admin":
process_add_return(admin('add', $attr));
break;

View File

@@ -105,8 +105,7 @@
"timeout2": "Časový limit pro připojení k lokálnímu serveru",
"username": "Uživatelské jméno",
"validate": "Ověřit",
"validation_success": "Úspěšně ověřeno",
"tags": "Štítky"
"validation_success": "Úspěšně ověřeno"
},
"admin": {
"access": "Přístupy",
@@ -334,11 +333,7 @@
"username": "Uživatelské jméno",
"validate_license_now": "Ověřit GUID na licenčním serveru",
"verify": "Ověřit",
"yes": "&#10003;",
"f2b_ban_time_increment": "Délka banu je prodlužována s každým dalším banem",
"f2b_max_ban_time": "Maximální délka banu (s)",
"ip_check": "Kontrola IP",
"ip_check_disabled": "Kontrola IP je vypnuta. Můžete ji zapnout v <br> <strong>System > Nastavení > Options > Přizpůsobení</strong>"
"yes": "&#10003;"
},
"danger": {
"access_denied": "Přístup odepřen nebo jsou neplatná data ve formuláři",
@@ -541,7 +536,7 @@
"inactive": "Neaktivní",
"kind": "Druh",
"last_modified": "Naposledy změněn",
"lookup_mx": "Cíl je regulární výraz který se shoduje s MX záznamem (<code>.*\\.google\\.com</code> směřuje veškerou poštu na MX které jsou cílem pro google.com přes tento skok)",
"lookup_mx": "Cíl je regulární výraz který se shoduje s MX záznamem (<code>.*google\\.com</code> směřuje veškerou poštu na MX které jsou cílem pro google.com přes tento skok)",
"mailbox": "Úprava mailové schránky",
"mailbox_quota_def": "Výchozí kvóta schránky",
"mailbox_relayhost_info": "Aplikované jen na uživatelskou schránku a přímé aliasy, přepisuje předávající server domény.",

View File

@@ -4,15 +4,15 @@
"app_passwds": "Administrer app-adgangskoder",
"bcc_maps": "BCC kort",
"delimiter_action": "Afgrænsning handling",
"eas_reset": "Nulstil EAS enheder",
"eas_reset": "Nulstil EAS endheder",
"extend_sender_acl": "Tillad at udvide afsenderens ACL med eksterne adresser",
"filters": "Filtre",
"login_as": "Login som mailboks bruger",
"prohibited": "Nægtet af ACL",
"protocol_access": "Skift protokol adgang",
"prohibited": "Forbudt af ACL",
"protocol_access": "Ændre protokol adgang",
"pushover": "Pushover",
"quarantine": "Karantænehandlinger",
"quarantine_attachments": "Karantænevedhæftede filer",
"quarantine": "Karantæneaktioner",
"quarantine_attachments": "Karantæne vedhæftede filer",
"quarantine_notification": "Skift karantænemeddelelser",
"ratelimit": "Satsgrænse",
"recipient_maps": "Modtagerkort",
@@ -20,15 +20,12 @@
"sogo_access": "Tillad styring af SOGo-adgang",
"sogo_profile_reset": "Nulstil SOGo-profil",
"spam_alias": "Midlertidige aliasser",
"spam_policy": "Sortliste/hvidliste",
"spam_policy": "Sortliste / hvidliste",
"spam_score": "Spam-score",
"syncjobs": "Synkroniserings job",
"tls_policy": "TLS politik",
"unlimited_quota": "Ubegrænset plads for mailbokse",
"domain_desc": "Skift domæne beskrivelse",
"domain_relayhost": "Skift relæ host for et domæne",
"mailbox_relayhost": "Skift relæ-host for en postkasse",
"quarantine_category": "Skift kategorien for karantænemeddelelse"
"domain_desc": "Skift domæne beskrivelse"
},
"add": {
"activate_filter_warn": "Alle andre filtre deaktiveres, når aktiv er markeret.",
@@ -62,7 +59,7 @@
"gal": "Global adresseliste",
"gal_info": "GAL indeholder alle objekter i et domæne og kan ikke redigeres af nogen bruger. Information om ledig / optaget i SOGo mangler, hvis deaktiveret! <b> Genstart SOGo for at anvende ændringer. </b>",
"generate": "generere",
"goto_ham": "Lær som <span class=\"text-success\"><b>ønsket</b></span>",
"goto_ham": "Lær som <span class=\"text-success\"><b>ham</b></span>",
"goto_null": "Kassér e-mail i stilhed",
"goto_spam": "Lær som <span class=\"text-danger\"><b>spam</b></span>",
"hostname": "Vært",
@@ -83,7 +80,7 @@
"private_comment": "Privat kommentar",
"public_comment": "Offentlig kommentar",
"quota_mb": "Kvota (Mb)",
"relay_all": "Besvar alle modtager",
"relay_all": "Send alle modtagere videre",
"relay_all_info": "↪ Hvis du vælger <b> ikke </b> at videresende alle modtagere, skal du tilføje et (\"blind\") postkasse til hver enkelt modtager, der skal videresendes.",
"relay_domain": "Send dette domæne videre",
"relay_transport_info": "<div class=\"badge fs-6 bg-info\">Info</div> Du kan definere transportkort til en tilpasset destination for dette domæne. Hvis ikke indstillet, foretages der et MX-opslag.",
@@ -104,10 +101,7 @@
"timeout2": "Timeout for forbindelse til lokal vært",
"username": "Brugernavn",
"validate": "Bekræft",
"validation_success": "Valideret med succes",
"bcc_dest_format": "BCC-destination skal være en enkelt gyldig e-mail-adresse.<br>Hvis du har brug for at sende en kopi til flere adresser, kan du oprette et alias og bruge det her.",
"app_passwd_protocols": "Tilladte protokoller for app adgangskode",
"tags": "Tag's"
"validation_success": "Valideret med succes"
},
"admin": {
"access": "Adgang",
@@ -314,10 +308,7 @@
"username": "Brugernavn",
"validate_license_now": "Valider GUID mod licensserver",
"verify": "Verificere",
"yes": "&#10003;",
"ip_check_opt_in": "Opt-In for brug af tredjepartstjeneste <strong>ipv4.mailcow.email</strong> og <strong>ipv6.mailcow.email</strong> til at finde eksterne IP-adresser.",
"queue_unban": "unban",
"admins": "Administratorer"
"yes": "&#10003;"
},
"danger": {
"access_denied": "Adgang nægtet eller ugyldig formular data",
@@ -434,8 +425,7 @@
"username_invalid": "Brugernavn %s kan ikke bruges",
"validity_missing": "Tildel venligst en gyldighedsperiode",
"value_missing": "Angiv alle værdier",
"yotp_verification_failed": "Yubico OTP verifikationen mislykkedes: %s",
"webauthn_publickey_failed": "Der er ikke gemt nogen offentlig nøgle for den valgte autentifikator"
"yotp_verification_failed": "Yubico OTP verifikationen mislykkedes: %s"
},
"debug": {
"chart_this_server": "Diagram (denne server)",
@@ -452,8 +442,7 @@
"solr_status": "Solr-status",
"started_on": "Startede den",
"static_logs": "Statiske logfiler",
"system_containers": "System og Beholdere",
"error_show_ip": "Kunne ikke finde de offentlige IP-adresser"
"system_containers": "System og Beholdere"
},
"diagnostics": {
"cname_from_a": "Værdi afledt af A / AAAA-post. Dette understøttes, så længe posten peger på den korrekte ressource.",
@@ -564,11 +553,7 @@
"title": "Rediger objekt",
"unchanged_if_empty": "Lad være tomt, hvis uændret",
"username": "Brugernavn",
"validate_save": "Valider og gem",
"admin": "Rediger administrator",
"lookup_mx": "Destination er et regulært udtryk, der matcher MX-navnet (<code>.*google\\.dk</code> for at dirigere al e-mail, der er målrettet til en MX, der ender på google.dk, over dette hop)",
"mailbox_relayhost_info": "Anvendt på postkassen og kun direkte aliasser, og overskriver et domæne relæ-host.",
"quota_warning_bcc": "Kvoteadvarsel BCC"
"validate_save": "Valider og gem"
},
"footer": {
"cancel": "Afbestille",
@@ -586,7 +571,7 @@
"header": {
"administration": "Konfiguration og detailer",
"apps": "Apps",
"debug": "Information",
"debug": "Systemoplysninger",
"email": "E-Mail",
"mailcow_config": "Konfiguration",
"quarantine": "Karantæne",
@@ -754,10 +739,7 @@
"username": "Brugernavn",
"waiting": "Venter",
"weekly": "Ugentlig",
"yes": "&#10003;",
"goto_ham": "Lær som <b>ønsket</b>",
"catch_all": "Fang-alt",
"open_logs": "Åben logfiler"
"yes": "&#10003;"
},
"oauth2": {
"access_denied": "Log ind som mailboks ejer for at give adgang via OAuth2.",
@@ -1048,7 +1030,7 @@
"spamfilter_table_empty": "Intet data at vise",
"spamfilter_table_remove": "slet",
"spamfilter_table_rule": "Regl",
"spamfilter_wl": "Hvidliste",
"spamfilter_wl": "Hvisliste",
"spamfilter_wl_desc": "Hvidlistede e-mail-adresser til <b>aldrig</b> at klassificeres som spam. Wildcards kan bruges. Et filter anvendes kun på direkte aliaser (aliaser med en enkelt målpostkasse) eksklusive catch-aliaser og selve en postkasse.",
"spamfilter_yellow": "Gul: denne besked kan være spam, vil blive tagget som spam og flyttes til din junk-mappe",
"status": "Status",
@@ -1084,11 +1066,5 @@
"quota_exceeded_scope": "Domænekvote overskredet: Kun ubegrænsede postkasser kan oprettes i dette domæneomfang.",
"session_token": "Form nøgle ugyldig: Nøgle passer ikke",
"session_ua": "Form nøgle ugyldig: Bruger-Agent gyldighedskontrols fejl"
},
"datatables": {
"lengthMenu": "Vis _MENU_ poster",
"paginate": {
"first": "Først"
}
}
}

View File

@@ -175,12 +175,10 @@
"empty": "Keine Einträge vorhanden",
"excludes": "Diese Empfänger ausschließen",
"f2b_ban_time": "Bannzeit in Sekunden",
"f2b_ban_time_increment": "Bannzeit erhöht sich mit jedem Bann",
"f2b_blacklist": "Blacklist für Netzwerke und Hosts",
"f2b_filter": "Regex-Filter",
"f2b_list_info": "Ein Host oder Netzwerk auf der Blacklist wird immer eine Whitelist-Einheit überwiegen. <b>Die Aktualisierung der Liste dauert einige Sekunden.</b>",
"f2b_max_attempts": "Max. Versuche",
"f2b_max_ban_time": "Maximale Bannzeit in Sekunden",
"f2b_netban_ipv4": "Netzbereich für IPv4-Banns (8-32)",
"f2b_netban_ipv6": "Netzbereich für IPv6-Banns (8-128)",
"f2b_parameters": "Fail2ban-Parameter",
@@ -216,7 +214,7 @@
"loading": "Bitte warten...",
"login_time": "Zeit",
"logo_info": "Die hochgeladene Grafik wird für die Navigationsleiste auf eine Höhe von 40px skaliert. Für die Darstellung auf der Login-Maske beträgt die skalierte Breite maximal 250px. Eine frei skalierbare Grafik (etwa SVG) wird empfohlen.",
"lookup_mx": "Ziel mit MX vergleichen (Regex, etwa <code>.*\\.google\\.com</code>, um alle Ziele mit MX *google.com zu routen)",
"lookup_mx": "Ziel mit MX vergleichen (Regex, etwa <code>.*google\\.com</code>, um alle Ziele mit MX *google.com zu routen)",
"main_name": "\"mailcow UI\" Name",
"merged_vars_hint": "Ausgegraute Reihen wurden aus der Datei <code>vars.(local.)inc.php</code> gelesen und können hier nicht verändert werden.",
"message": "Nachricht",
@@ -498,7 +496,6 @@
}
},
"debug": {
"architecture": "Architektur",
"chart_this_server": "Chart (dieser Server)",
"containers_info": "Container-Information",
"container_running": "Läuft",
@@ -535,8 +532,7 @@
"update_available": "Es ist ein Update verfügbar",
"no_update_available": "Das System ist auf aktuellem Stand",
"update_failed": "Es konnte nicht nach einem Update gesucht werden",
"username": "Benutzername",
"wip": "Aktuell noch in Arbeit"
"username": "Benutzername"
},
"diagnostics": {
"cname_from_a": "Wert abgeleitet von A/AAAA-Eintrag. Wird unterstützt, sofern der Eintrag auf die korrekte Ressource zeigt.",
@@ -595,7 +591,7 @@
"inactive": "Inaktiv",
"kind": "Art",
"last_modified": "Zuletzt geändert",
"lookup_mx": "Ziel mit MX vergleichen (Regex, etwa <code>.*\\.google\\.com</code>, um alle Ziele mit MX *google.com zu routen)",
"lookup_mx": "Ziel mit MX vergleichen (Regex, etwa <code>.*google\\.com</code>, um alle Ziele mit MX *google.com zu routen)",
"mailbox": "Mailbox bearbeiten",
"mailbox_quota_def": "Standard-Quota einer Mailbox",
"mailbox_relayhost_info": "Wird auf eine Mailbox und direkte Alias-Adressen angewendet. Überschreibt die Einstellung einer Domain.",

View File

@@ -177,12 +177,10 @@
"empty": "No results",
"excludes": "Excludes these recipients",
"f2b_ban_time": "Ban time (s)",
"f2b_ban_time_increment": "Ban time is incremented with each ban",
"f2b_blacklist": "Blacklisted networks/hosts",
"f2b_filter": "Regex filters",
"f2b_list_info": "A blacklisted host or network will always outweigh a whitelist entity. <b>List updates will take a few seconds to be applied.</b>",
"f2b_max_attempts": "Max. attempts",
"f2b_max_ban_time": "Max. ban time (s)",
"f2b_netban_ipv4": "IPv4 subnet size to apply ban on (8-32)",
"f2b_netban_ipv6": "IPv6 subnet size to apply ban on (8-128)",
"f2b_parameters": "Fail2ban parameters",
@@ -218,7 +216,7 @@
"loading": "Please wait...",
"login_time": "Login time",
"logo_info": "Your image will be scaled to a height of 40px for the top navigation bar and a max. width of 250px for the start page. A scalable graphic is highly recommended.",
"lookup_mx": "Destination is a regular expression to match against MX name (<code>.*\\.google\\.com</code> to route all mail targeted to a MX ending in google.com over this hop)",
"lookup_mx": "Destination is a regular expression to match against MX name (<code>.*google\\.com</code> to route all mail targeted to a MX ending in google.com over this hop)",
"main_name": "\"mailcow UI\" name",
"merged_vars_hint": "Greyed out rows were merged from <code>vars.(local.)inc.php</code> and cannot be modified.",
"message": "Message",
@@ -498,7 +496,6 @@
}
},
"debug": {
"architecture": "Architecture",
"chart_this_server": "Chart (this server)",
"containers_info": "Container information",
"container_running": "Running",
@@ -535,8 +532,7 @@
"update_available": "There is an update available",
"no_update_available": "The System is on the latest version",
"update_failed": "Could not check for an Update",
"username": "Username",
"wip": "Currently Work in Progress"
"username": "Username"
},
"diagnostics": {
"cname_from_a": "Value derived from A/AAAA record. This is supported as long as the record points to the correct resource.",
@@ -595,7 +591,7 @@
"inactive": "Inactive",
"kind": "Kind",
"last_modified": "Last modified",
"lookup_mx": "Destination is a regular expression to match against MX name (<code>.*\\.google\\.com</code> to route all mail targeted to a MX ending in google.com over this hop)",
"lookup_mx": "Destination is a regular expression to match against MX name (<code>.*google\\.com</code> to route all mail targeted to a MX ending in google.com over this hop)",
"mailbox": "Edit mailbox",
"mailbox_quota_def": "Default mailbox quota",
"mailbox_relayhost_info": "Applied to the mailbox and direct aliases only, does override a domain relayhost.",

View File

@@ -141,11 +141,9 @@
"empty": "Sin resultados",
"excludes": "Excluye a estos destinatarios",
"f2b_ban_time": "Tiempo de restricción (s)",
"f2b_ban_time_increment": "Tiempo de restricción se incrementa con cada restricción",
"f2b_blacklist": "Redes y hosts en lista negra",
"f2b_list_info": "Un host o red en lista negra siempre superará a una entidad de la lista blanca. <b>Las actualizaciones de la lista tardarán unos segundos en aplicarse.</b>",
"f2b_max_attempts": "Max num. de intentos",
"f2b_max_ban_time": "Max tiempo de restricción (s)",
"f2b_netban_ipv4": "Tamaño de subred IPv4 para aplicar la restricción (8-32)",
"f2b_netban_ipv6": "Tamaño de subred IPv6 para aplicar la restricción (8-128)",
"f2b_parameters": "Parametros Fail2ban",

View File

@@ -24,7 +24,7 @@
"spam_policy": "Liste Noire/Liste Blanche",
"spam_score": "Score SPAM",
"syncjobs": "Tâches de synchronisation",
"tls_policy": "Politique TLS",
"tls_policy": "Police TLS",
"unlimited_quota": "Quota illimité pour les boites de courriel",
"domain_desc": "Modifier la description du domaine",
"domain_relayhost": "Changer le relais pour un domaine",
@@ -106,8 +106,7 @@
"validate": "Valider",
"validation_success": "Validation réussie",
"bcc_dest_format": "La destination Cci doit être une seule adresse e-mail valide.<br>Si vous avez besoin d'envoyer une copie à plusieurs adresses, créez un alias et utilisez-le ici.",
"tags": "Etiquettes",
"app_passwd_protocols": "Protocoles autorisés pour le mot de passe de l'application"
"tags": "Etiquettes"
},
"admin": {
"access": "Accès",
@@ -172,13 +171,11 @@
"edit": "Editer",
"empty": "Aucun résultat",
"excludes": "Exclure ces destinataires",
"f2b_ban_time": "Durée du bannissement (s)",
"f2b_ban_time_increment": "Durée du bannissement est augmentée à chaque bannissement",
"f2b_ban_time": "Durée du bannissement(s)",
"f2b_blacklist": "Réseaux/Domaines sur Liste Noire",
"f2b_filter": "Filtre(s) Regex",
"f2b_list_info": "Un hôte ou un réseau sur liste noire l'emportera toujours sur une entité de liste blanche. <b>L'application des mises à jour de liste prendra quelques secondes.</b>",
"f2b_max_attempts": "Nb max. de tentatives",
"f2b_max_ban_time": "Max. durée du bannissement (s)",
"f2b_netban_ipv4": "Taille du sous-réseau IPv4 pour l'application du bannissement (8-32)",
"f2b_netban_ipv6": "Taille du sous-réseau IPv6 pour l'application du bannissement (8-128)",
"f2b_parameters": "Paramètres Fail2ban",
@@ -324,9 +321,7 @@
"admins": "Administrateurs",
"api_read_only": "Accès lecture-seule",
"password_policy_lowerupper": "Doit contenir des caractères minuscules et majuscules",
"password_policy_numbers": "Doit contenir au moins un chiffre",
"ip_check": "Vérification IP",
"ip_check_disabled": "La vérification IP est désactivée. Vous pouvez l'activer sous<br> <strong>Système > Configuration > Options > Personnaliser</strong>"
"password_policy_numbers": "Doit contenir au moins un chiffre"
},
"danger": {
"access_denied": "Accès refusé ou données de formulaire non valides",
@@ -445,12 +440,7 @@
"username_invalid": "Le nom d'utilisateur %s ne peut pas être utilisé",
"validity_missing": "Veuillez attribuer une période de validité",
"value_missing": "Veuillez fournir toutes les valeurs",
"yotp_verification_failed": "La vérification Yubico OTP a échoué : %s",
"webauthn_authenticator_failed": "L'authentificateur selectionné est introuvable",
"demo_mode_enabled": "Le mode de démonstration est activé",
"template_exists": "La template %s existe déja",
"template_id_invalid": "Le numéro de template %s est invalide",
"template_name_invalid": "Le nom de la template est invalide"
"yotp_verification_failed": "La vérification Yubico OTP a échoué : %s"
},
"debug": {
"chart_this_server": "Graphique (ce serveur)",
@@ -588,7 +578,7 @@
"unchanged_if_empty": "Si non modifié, laisser en blanc",
"username": "Nom d'utilisateur",
"validate_save": "Valider et sauver",
"lookup_mx": "La destination est une expression régulière qui doit correspondre avec le nom du MX (<code>.*\\.google\\.com</code> pour acheminer tout le courrier destiné à un MX se terminant par google.com via ce saut)",
"lookup_mx": "La destination est une expression régulière qui doit correspondre avec le nom du MX (<code>.*google\\.com</code> pour acheminer tout le courrier destiné à un MX se terminant par google.com via ce saut).",
"mailbox_relayhost_info": "S'applique uniquement à la boîte aux lettres et aux alias directs, remplace le relayhost du domaine."
},
"footer": {
@@ -1091,12 +1081,9 @@
"username": "Nom d'utilisateur",
"verify": "Vérification",
"waiting": "En attente",
"week": "semaine",
"week": "Semaine",
"weekly": "Hebdomadaire",
"weeks": "semaines",
"months": "mois",
"year": "année",
"years": "années"
"weeks": "semaines"
},
"warning": {
"cannot_delete_self": "Impossible de supprimer lutilisateur connecté",

View File

@@ -175,12 +175,10 @@
"empty": "Nessun risultato",
"excludes": "Esclude questi destinatari",
"f2b_ban_time": "Tempo di blocco (s)",
"f2b_ban_time_increment": "Tempo di blocco aumenta ad ogni blocco",
"f2b_blacklist": "Host/reti in blacklist",
"f2b_filter": "Filtri Regex",
"f2b_list_info": "Un host oppure una rete in blacklist, avrà sempre un peso maggiore rispetto ad una in whitelist. <b>L'aggiornamento della lista richiede alcuni secondi per la sua entrata in azione.</b>",
"f2b_max_attempts": "Tentativi massimi",
"f2b_max_ban_time": "Tempo massimo di blocco (s)",
"f2b_netban_ipv4": "IPv4 subnet size to apply ban on (8-32)",
"f2b_netban_ipv6": "IPv6 subnet size to apply ban on (8-128)",
"f2b_parameters": "Parametri Fail2ban",
@@ -213,7 +211,7 @@
"loading": "Caricamento in corso...",
"login_time": "Ora di accesso",
"logo_info": "La tua immagine verrà ridimensionata a 40px di altezza, quando verrà usata nella barra di navigazione in alto, ed ad una larghezza massima di 250px nella schermata iniziale. È altamente consigliato l'utilizzo di un'immagine modulabile.",
"lookup_mx": "Destination is a regular expression to match against MX name (<code>.*\\.google\\.com</code> to route all mail targeted to a MX ending in google.com over this hop)",
"lookup_mx": "Destination is a regular expression to match against MX name (<code>.*google\\.com</code> to route all mail targeted to a MX ending in google.com over this hop)",
"main_name": "Nome \"mailcow UI\"",
"merged_vars_hint": "Greyed out rows were merged from <code>vars.(local.)inc.php</code> and cannot be modified.",
"message": "Messaggio",
@@ -554,7 +552,7 @@
"hostname": "Hostname",
"inactive": "Inattivo",
"kind": "Genere",
"lookup_mx": "Destination is a regular expression to match against MX name (<code>.*\\.google\\.com</code> to route all mail targeted to a MX ending in google.com over this hop)",
"lookup_mx": "Destination is a regular expression to match against MX name (<code>.*google\\.com</code> to route all mail targeted to a MX ending in google.com over this hop)",
"mailbox": "Modifica casella di posta",
"mailbox_quota_def": "Default mailbox quota",
"mailbox_relayhost_info": "Applied to the mailbox and direct aliases only, does override a domain relayhost.",

View File

@@ -168,12 +168,10 @@
"empty": "Geen resultaten",
"excludes": "Exclusief",
"f2b_ban_time": "Verbanningstijd (s)",
"f2b_ban_time_increment": "Verbanningstijd wordt verhoogd met elk verbanning",
"f2b_blacklist": "Netwerken/hosts op de blacklist",
"f2b_filter": "Regex-filters",
"f2b_list_info": "Een host of netwerk op de blacklist staat altijd boven eenzelfde op de whitelist. <b>Het doorvoeren van wijzigingen kan enkele seconden in beslag nemen.</b>",
"f2b_max_attempts": "Maximaal aantal pogingen",
"f2b_max_ban_time": "Maximaal verbanningstijd (s)",
"f2b_netban_ipv4": "Voer de IPv4-subnetgrootte in waar de verbanning van kracht moet zijn (8-32)",
"f2b_netban_ipv6": "Voer de IPv6-subnetgrootte in waar de verbanning van kracht moet zijn (8-128)",
"f2b_parameters": "Fail2ban",

View File

@@ -1,8 +1,7 @@
{
"acl": {
"sogo_profile_reset": "Usuń profil SOGo (webmail)",
"syncjobs": "Polecenie synchronizacji",
"alias_domains": "Dodaj aliasy domen"
"syncjobs": "Polecenie synchronizacji"
},
"add": {
"active": "Aktywny",

View File

@@ -539,7 +539,7 @@
"inactive": "Inactiv",
"kind": "Fel",
"last_modified": "Ultima modificare",
"lookup_mx": "Destinația este o expresie regulată care potrivită cu numele MX (<code>.*\\.google\\.com</code> pentru a direcționa toate e-mailurile vizate către un MX care se termină în google.com peste acest hop)",
"lookup_mx": "Destinația este o expresie regulată care potrivită cu numele MX (<code>.*google\\.com</code> pentru a direcționa toate e-mailurile vizate către un MX care se termină în google.com peste acest hop)",
"mailbox": "Editează căsuța poștală",
"mailbox_quota_def": "Cota implicită a căsuței poștale",
"mailbox_relayhost_info": "Aplicat numai căsuței poștale și aliasurilor directe, suprascrie un transport dependent de domeniu.",

View File

@@ -336,9 +336,7 @@
"validate_license_now": "Получить лицензию на основе GUID с сервера лицензий",
"verify": "Проверить",
"yes": "&#10003;",
"queue_unban": "разблокировать",
"f2b_ban_time_increment": "Время бана увеличивается с каждым баном",
"f2b_max_ban_time": "Максимальное время блокировки"
"queue_unban": "разблокировать"
},
"danger": {
"access_denied": "Доступ запрещён, или указаны неверные данные",

View File

@@ -213,7 +213,7 @@
"loading": "Čakajte prosím ...",
"login_time": "Čas prihlásenia",
"logo_info": "Váš obrázok bude upravený na výšku 40px pre vrchný navigačný riadok a na maximálnu šírku 250px pre úvodnú stránku. Odporúča sa škálovateľná grafika.",
"lookup_mx": "Cieľ je regulárny výraz ktorý sa porovnáva s MX záznamom (<code>.*\\.google\\.com</code> smeruje všetku poštu určenú pre MX ktoré sú cieľom pre google.com cez tento skok)",
"lookup_mx": "Cieľ je regulárny výraz ktorý sa porovnáva s MX záznamom (<code>.*google\\.com</code> smeruje všetku poštu určenú pre MX ktoré sú cieľom pre google.com cez tento skok)",
"main_name": "\"mailcow UI\" názov",
"merged_vars_hint": "Sivé riadky boli načítané z <code>vars.(local.)inc.php</code> a nemôžu byť modifikované cez UI.",
"message": "Správa",
@@ -539,7 +539,7 @@
"inactive": "Neaktívny",
"kind": "Druh",
"last_modified": "Naposledy upravené",
"lookup_mx": "Cieľ je regulárny výraz ktorý sa zhoduje s MX záznamom (<code>.*\\.google\\.com</code> smeruje všetku poštu na MX ktoré sú cieľom pre google.com cez tento skok)",
"lookup_mx": "Cieľ je regulárny výraz ktorý sa zhoduje s MX záznamom (<code>.*google\\.com</code> smeruje všetku poštu na MX ktoré sú cieľom pre google.com cez tento skok)",
"mailbox": "Upraviť mailovú schránku",
"mailbox_quota_def": "Predvolená veľkosť mailovej schránky",
"mailbox_relayhost_info": "Aplikované len na používateľské schránky a priame aliasy, prepisuje doménového preposielateľa.",

View File

@@ -213,7 +213,7 @@
"loading": "请等待...",
"login_time": "登录时间",
"logo_info": "你的图片将会在顶部导航栏被缩放为 40px 高,在起始页被缩放为最大 250px 高。强烈推荐使用能较好适应缩放的图片。",
"lookup_mx": "应当为一个正则表达式,用于匹配 MX 记录 (例如 <code>.*\\.google\\.com</code> 将转发所有拥有以 google.com 结尾的 MX 记录的邮件)",
"lookup_mx": "应当为一个正则表达式,用于匹配 MX 记录 (例如 <code>.*google\\.com</code> 将转发所有拥有以 google.com 结尾的 MX 记录的邮件)",
"main_name": "Mailcow UI 的名称",
"merged_vars_hint": "灰色行来自 <code>vars.(local.)inc.php</code> 文件并且无法修改。",
"message": "消息",
@@ -544,7 +544,7 @@
"hostname": "主机名",
"inactive": "禁用",
"kind": "类型",
"lookup_mx": "应当为一个正则表达式,用于匹配 MX 记录 (例如 <code>.*\\.google\\.com</code> 将转发所有拥有以 google.com 结尾的 MX 记录的邮件)",
"lookup_mx": "应当为一个正则表达式,用于匹配 MX 记录 (例如 <code>.*google\\.com</code> 将转发所有拥有以 google.com 结尾的 MX 记录的邮件)",
"mailbox": "编辑邮箱",
"mailbox_quota_def": "邮箱默认配额",
"mailbox_relayhost_info": "只适用于邮箱和邮箱别名,不会覆盖域名的中继主机。",

View File

@@ -213,7 +213,7 @@
"loading": "請稍等...",
"login_time": "登入時間",
"logo_info": "你的起始頁面圖片會在頂部導覽列的限制下被縮放為 40px 高,以及最大 250px 高度。強烈推薦使用能較好縮放的圖片。",
"lookup_mx": "目的地是可以用來匹配 MX 紀錄的正規表達式 (<code>.*\\.google\\.com</code> 會將所有 MX 結尾於 google.com 的郵件轉發到此主機。)",
"lookup_mx": "目的地是可以用來匹配 MX 紀錄的正規表達式 (<code>.*google\\.com</code> 會將所有 MX 結尾於 google.com 的郵件轉發到此主機。)",
"main_name": "\"mailcow UI\" 名稱",
"merged_vars_hint": "灰色列來自 <code>vars.(local.)inc.php</code> 並且不能修改。",
"message": "訊息",
@@ -540,7 +540,7 @@
"inactive": "停用",
"kind": "種類",
"last_modified": "上次修改時間",
"lookup_mx": "目的地是可以用來匹配 MX 紀錄的正規表達式 (<code>.*\\.google\\.com</code> 會將所有 MX 結尾於 google.com 的郵件轉發到此主機。)",
"lookup_mx": "目的地是可以用來匹配 MX 紀錄的正規表達式 (<code>.*google\\.com</code> 會將所有 MX 結尾於 google.com 的郵件轉發到此主機。)",
"mailbox": "編輯信箱",
"mailbox_quota_def": "預設信箱容量配額",
"mailbox_relayhost_info": "只會套用於信箱和直接別名,不會覆寫域名中繼主機。",

View File

@@ -39,19 +39,10 @@ if (isset($_SERVER['PHP_AUTH_USER'])) {
elseif (isset($_GET['login'])) {
// load prerequisites only when required
require_once $_SERVER['DOCUMENT_ROOT'] . '/inc/prerequisites.inc.php';
$login = html_entity_decode(rawurldecode($_GET["login"]));
if (!empty($_GET['sso_token'])) {
$login = mailbox_sso('check', $_GET['sso_token']);
if ($login !== false) {
$_SESSION['mailcow_cc_username'] = $login;
$_SESSION['mailcow_cc_role'] = 'user';
}
}
// check if dual_login is active
$is_dual = (!empty($_SESSION["dual-login"]["username"])) ? true : false;
// check permissions (if dual_login is active, deny sso when acl is not given)
$login = html_entity_decode(rawurldecode($_GET["login"]));
if (isset($_SESSION['mailcow_cc_role']) &&
(($_SESSION['acl']['login_as'] == "1" && $ALLOW_ADMIN_EMAIL_LOGIN !== 0) || ($is_dual === false && $login == $_SESSION['mailcow_cc_username']))) {
if (filter_var($login, FILTER_VALIDATE_EMAIL)) {
@@ -69,7 +60,7 @@ elseif (isset($_GET['login'])) {
':remote_addr' => ($_SERVER['HTTP_X_REAL_IP'] ?? $_SERVER['REMOTE_ADDR'])
));
// redirect to sogo (sogo will get the correct credentials via nginx auth_request
header("Location: /SOGo/so/{$login}");
header("Location: /SOGo/so/${login}");
exit;
}
}

View File

@@ -12,14 +12,6 @@
<label for="f2b_ban_time">{{ lang.admin.f2b_ban_time }}:</label>
<input type="number" class="form-control" id="f2b_ban_time" name="ban_time" value="{{ f2b_data.ban_time }}" required>
</div>
<div class="mb-4">
<label for="f2b_max_ban_time">{{ lang.admin.f2b_max_ban_time }}:</label>
<input type="number" class="form-control" id="f2b_max_ban_time" name="max_ban_time" value="{{ f2b_data.max_ban_time }}" required>
</div>
<div class="mb-4">
<input class="form-check-input" type="checkbox" value="1" name="ban_time_increment" id="f2b_ban_time_increment" {% if f2b_data.ban_time_increment == 1 %}checked{% endif %}>
<label class="form-check-label" for="f2b_ban_time_increment">{{ lang.admin.f2b_ban_time_increment }}</label>
</div>
<div class="mb-4">
<label for="f2b_max_attempts">{{ lang.admin.f2b_max_attempts }}:</label>
<input type="number" class="form-control" id="f2b_max_attempts" name="max_attempts" value="{{ f2b_data.max_attempts }}" required>

View File

@@ -49,12 +49,6 @@
<p><b>{{ hostname }}</b></p>
</div></td>
</tr>
<tr>
<td>{{ lang.debug.architecture }}</td>
<td class="text-break"><div>
<p id="host_architecture">-</p>
</div></td>
</tr>
<tr>
<td>IPs</td>
<td class="text-break">
@@ -76,7 +70,7 @@
<td>Version</td>
<td class="text-break">
<div class="fw-bolder">
<p ><a href="#" id="mailcow_version">{{ mailcow_info.version_tag }}</a></p>
<p ><a href="#" id="maiclow_version">{{ mailcow_info.version_tag }}</a></p>
<p id="mailcow_update"></p>
</div>
</td>
@@ -618,7 +612,7 @@
<li class="table_collapse_option"><a class="dropdown-item" data-datatables-expand="rl_log" data-table="rl_log" href="#">{{ lang.datatables.expand_all }}</a></li>
<li class="table_collapse_option"><a class="dropdown-item" data-datatables-collapse="rl_log" data-table="rl_log" href="#">{{ lang.datatables.collapse_all }}</a></li>
</ul>
<p class="text-muted">{{ lang.admin.hash_remove_info|raw }}</p>
<p class="text-muted">{{ lang.admin.hash_remove_info }}</p>
<table id="rl_log" class="table table-striped dt-responsive w-100"></table>
</div>
</div>

View File

@@ -109,25 +109,25 @@
<label class="control-label col-sm-2">{{ lang.user.quarantine_notification }}</label>
<div class="col-sm-10">
<div class="btn-group" data-acl="{{ acl.quarantine_notification }}">
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-light{% if quarantine_notification == 'never' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-secondary{% if quarantine_notification == 'never' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailbox }}"
data-id="quarantine_notification"
data-api-url='edit/quarantine_notification'
data-api-attr='{"quarantine_notification":"never"}'>{{ lang.user.never }}</button>
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-light{% if quarantine_notification == 'hourly' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-secondary{% if quarantine_notification == 'hourly' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailbox }}"
data-id="quarantine_notification"
data-api-url='edit/quarantine_notification'
data-api-attr='{"quarantine_notification":"hourly"}'>{{ lang.user.hourly }}</button>
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-light{% if quarantine_notification == 'daily' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-secondary{% if quarantine_notification == 'daily' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailbox }}"
data-id="quarantine_notification"
data-api-url='edit/quarantine_notification'
data-api-attr='{"quarantine_notification":"daily"}'>{{ lang.user.daily }}</button>
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-light{% if quarantine_notification == 'weekly' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-secondary{% if quarantine_notification == 'weekly' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailbox }}"
data-id="quarantine_notification"
@@ -141,19 +141,19 @@
<label class="control-label col-sm-2">{{ lang.user.quarantine_category }}</label>
<div class="col-sm-10">
<div class="btn-group" data-acl="{{ acl.quarantine_category }}">
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-light{% if quarantine_category == 'reject' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-secondary{% if quarantine_category == 'reject' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailbox }}"
data-id="quarantine_category"
data-api-url='edit/quarantine_category'
data-api-attr='{"quarantine_category":"reject"}'>{{ lang.user.q_reject }}</button>
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-light{% if quarantine_category == 'add_header' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-secondary{% if quarantine_category == 'add_header' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailbox }}"
data-id="quarantine_category"
data-api-url='edit/quarantine_category'
data-api-attr='{"quarantine_category":"add_header"}'>{{ lang.user.q_add_header }}</button>
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-light{% if quarantine_category == 'all' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-secondary{% if quarantine_category == 'all' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailbox }}"
data-id="quarantine_category"
@@ -167,13 +167,13 @@
<label class="control-label col-sm-2" for="sender_acl">{{ lang.user.tls_policy }}</label>
<div class="col-sm-10">
<div class="btn-group" data-acl="{{ acl.tls_policy }}">
<button type="button" class="btn btn-sm btn-xs-half d-block d-sm-inline btn-light{% if get_tls_policy.tls_enforce_in == '1' %} btn-dark"{% endif %}"
<button type="button" class="btn btn-sm btn-xs-half d-block d-sm-inline btn-secondary{% if get_tls_policy.tls_enforce_in == '1' %} active"{% endif %}"
data-action="edit_selected"
data-item="{{ mailbox }}"
data-id="tls_policy"
data-api-url='edit/tls_policy'
data-api-attr='{"tls_enforce_in": {% if get_tls_policy.tls_enforce_in == '1' %}0{% else %}1{% endif %} }'>{{ lang.user.tls_enforce_in }}</button>
<button type="button" class="btn btn-sm btn-xs-half d-block d-sm-inline btn-light{% if get_tls_policy.tls_enforce_out == '1' %} btn-dark"{% endif %}"
<button type="button" class="btn btn-sm btn-xs-half d-block d-sm-inline btn-secondary{% if get_tls_policy.tls_enforce_out == '1' %} active"{% endif %}"
data-action="edit_selected"
data-item="{{ mailbox }}"
data-id="tls_policy"

View File

@@ -19,7 +19,7 @@
</li>
<li class="nav-item" role="presentation"><button class="nav-link" aria-controls="tab-resources" role="tab" data-bs-toggle="tab" data-bs-target="#tab-resources">{{ lang.mailbox.resources }}</button></li>
<li class="nav-item dropdown">
<a class="nav-link dropdown-toggle" data-bs-toggle="dropdown" href="#">{{ lang.mailbox.aliases }}</a>
<a class="nav-link dropdown-toggle" data-bs-toggle="dropdown" data-bs-target="#">{{ lang.mailbox.aliases }}</a>
<ul class="dropdown-menu">
<li role="presentation"><button class="dropdown-item" aria-selected="false" aria-controls="tab-mbox-aliases" role="tab" data-bs-toggle="tab" data-bs-target="#tab-mbox-aliases">{{ lang.mailbox.aliases }}</button></li>
<li role="presentation"><button class="dropdown-item" aria-selected="false" aria-controls="tab-domain-aliases" role="tab" data-bs-toggle="tab" data-bs-target="#tab-domain-aliases">{{ lang.mailbox.domain_aliases }}</button></li>

View File

@@ -54,7 +54,6 @@
<li class="dropdown-header">SMTP</li>
<li><a class="dropdown-item" data-action="edit_selected" data-id="mailbox" data-api-url='edit/mailbox' data-api-attr='{"smtp_access":1}' href="#">{{ lang.mailbox.activate }}</a></li>
<li><a class="dropdown-item" data-action="edit_selected" data-id="mailbox" data-api-url='edit/mailbox' data-api-attr='{"smtp_access":0}' href="#">{{ lang.mailbox.deactivate }}</a></li>
<li><hr class="dropdown-divider"></li>
<li class="dropdown-header">Sieve</li>
<li><a class="dropdown-item" data-action="edit_selected" data-id="mailbox" data-api-url='edit/mailbox' data-api-attr='{"sieve_access":1}' href="#">{{ lang.mailbox.activate }}</a></li>
<li><a class="dropdown-item" data-action="edit_selected" data-id="mailbox" data-api-url='edit/mailbox' data-api-attr='{"sieve_access":0}' href="#">{{ lang.mailbox.deactivate }}</a></li>
@@ -62,7 +61,7 @@
<a class="btn btn-sm btn-success" href="#" data-bs-toggle="modal" data-bs-target="#addMailboxModal"><i class="bi bi-plus-lg"></i> {{ lang.mailbox.add_mailbox }}</a>
</div>
<div class="btn-group d-none d-lg-flex">
<a class="btn btn-sm btn-secondary" id="toggle_multi_select_all" data-id="mailbox" href="#"><i class="bi bi-check-all"></i> {{ lang.mailbox.toggle_all }}</a>
<a class="btn btn-sm btn-secondary" id="toggle_multi_select_all" data-id="mailbox" href="#"><i class="bi bi-check-all"></i> {{ lang.mailbox.toggle_all }}</a>
<a class="btn btn-sm btn-xs-half btn-secondary dropdown-toggle" data-bs-toggle="dropdown" href="#">{{ lang.mailbox.quick_actions }}</a>
<ul class="dropdown-menu">
<li class="table_collapse_option"><a class="dropdown-item" data-datatables-expand="mailbox_table">{{ lang.datatables.expand_all }}</a></li>

View File

@@ -12,21 +12,11 @@
<li><button class="dropdown-item" role="tab" aria-selected="false" aria-controls="tab-config-f2b" data-bs-toggle="tab" data-bs-target="#tab-user-settings">{{ lang.user.mailbox_settings }}</button></li>
</ul>
</li>
{% if acl.spam_alias == 1 %}
<li class="nav-item" role="presentation"><button class="nav-link" role="tab" aria-selected="false" aria-controls="SpamAliases" role="tab" data-bs-toggle="tab" data-bs-target="#SpamAliases">{{ lang.user.spam_aliases }}</button></li>
{% endif %}
{% if acl.spam_score == 1 %}
<li class="nav-item" role="presentation"><button class="nav-link" role="tab" aria-selected="false" aria-controls="Spamfilter" role="tab" data-bs-toggle="tab" data-bs-target="#Spamfilter">{{ lang.user.spamfilter }}</button></li>
{% endif %}
{% if acl.syncjobs == 1 %}
<li class="nav-item" role="presentation"><button class="nav-link" role="tab" aria-selected="false" aria-controls="Syncjobs" role="tab" data-bs-toggle="tab" data-bs-target="#Syncjobs">{{ lang.user.sync_jobs }}</button></li>
{% endif %}
{% if acl.app_passwds == 1 %}
<li class="nav-item" role="presentation"><button class="nav-link" role="tab" aria-selected="false" aria-controls="AppPasswds" role="tab" data-bs-toggle="tab" data-bs-target="#AppPasswds">{{ lang.user.app_passwds }}</button></li>
{% endif %}
{% if acl.pushover == 1 %}
<li class="nav-item" role="presentation"><button class="nav-link" role="tab" aria-selected="false" aria-controls="Pushover" role="tab" data-bs-toggle="tab" data-bs-target="#Pushover">Pushover API</button></li>
{% endif %}
</ul>
<div class="row">
@@ -35,11 +25,11 @@
{% include 'user/tab-user-auth.twig' %}
{% include 'user/tab-user-details.twig' %}
{% include 'user/tab-user-settings.twig' %}
{% if acl.spam_alias == 1 %}{% include 'user/SpamAliases.twig' %}{% endif %}
{% if acl.spam_score == 1 %}{% include 'user/Spamfilter.twig' %}{% endif %}
{% if acl.syncjobs == 1 %}{% include 'user/Syncjobs.twig' %}{% endif %}
{% if acl.app_passwds == 1 %}{% include 'user/AppPasswds.twig' %}{% endif %}
{% if acl.pushover == 1 %}{% include 'user/Pushover.twig' %}{% endif %}
{% include 'user/SpamAliases.twig' %}
{% include 'user/Spamfilter.twig' %}
{% include 'user/Syncjobs.twig' %}
{% include 'user/AppPasswds.twig' %}
{% include 'user/Pushover.twig' %}
</div>
</div>
</div>

View File

@@ -12,19 +12,19 @@
<div class="col-sm-3 col-12 text-sm-end text-start text-xs-bold mb-4">{{ lang.user.tag_handling }}:</div>
<div class="col-sm-9 col-12">
<div class="btn-group" data-acl="{{ acl.delimiter_action }}">
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-light{% if get_tagging_options == 'subfolder' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-secondary{% if get_tagging_options == 'subfolder' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="delimiter_action"
data-api-url='edit/delimiter_action'
data-api-attr='{"tagged_mail_handler":"subfolder"}'>{{ lang.user.tag_in_subfolder }}</button>
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-light{% if get_tagging_options == 'subject' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-secondary{% if get_tagging_options == 'subject' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="delimiter_action"
data-api-url='edit/delimiter_action'
data-api-attr='{"tagged_mail_handler":"subject"}'>{{ lang.user.tag_in_subject }}</button>
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-light{% if get_tagging_options == 'none' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-secondary{% if get_tagging_options == 'none' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="delimiter_action"
@@ -40,13 +40,13 @@
<div class="col-sm-3 col-12 text-sm-end text-start text-xs-bold mb-4">{{ lang.user.tls_policy }}:</div>
<div class="col-sm-9 col-12">
<div class="btn-group" data-acl="{{ acl.tls_policy }}">
<button type="button" class="btn btn-sm btn-xs-half d-block d-sm-inline btn-light{% if get_tls_policy.tls_enforce_in == '1' %} btn-dark"{% endif %}"
<button type="button" class="btn btn-sm btn-xs-half d-block d-sm-inline btn-secondary{% if get_tls_policy.tls_enforce_in == '1' %} active"{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="tls_policy"
data-api-url='edit/tls_policy'
data-api-attr='{"tls_enforce_in": {% if get_tls_policy.tls_enforce_in == '1' %}0{% else %}1{% endif %} }'>{{ lang.user.tls_enforce_in }}</button>
<button type="button" class="btn btn-sm btn-xs-half d-block d-sm-inline btn-light{% if get_tls_policy.tls_enforce_out == '1' %} btn-dark"{% endif %}"
<button type="button" class="btn btn-sm btn-xs-half d-block d-sm-inline btn-secondary{% if get_tls_policy.tls_enforce_out == '1' %} active"{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="tls_policy"
@@ -61,25 +61,25 @@
<div class="col-sm-3 col-12 text-sm-end text-start text-xs-bold mb-4">{{ lang.user.quarantine_notification }}:</div>
<div class="col-sm-9 col-12">
<div class="btn-group" data-acl="{{ acl.quarantine_notification }}">
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-light{% if quarantine_notification == 'never' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-secondary{% if quarantine_notification == 'never' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="quarantine_notification"
data-api-url='edit/quarantine_notification'
data-api-attr='{"quarantine_notification":"never"}'>{{ lang.user.never }}</button>
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-light{% if quarantine_notification == 'hourly' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-secondary{% if quarantine_notification == 'hourly' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="quarantine_notification"
data-api-url='edit/quarantine_notification'
data-api-attr='{"quarantine_notification":"hourly"}'>{{ lang.user.hourly }}</button>
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-light{% if quarantine_notification == 'daily' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-secondary{% if quarantine_notification == 'daily' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="quarantine_notification"
data-api-url='edit/quarantine_notification'
data-api-attr='{"quarantine_notification":"daily"}'>{{ lang.user.daily }}</button>
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-light{% if quarantine_notification == 'weekly' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-quart d-block d-sm-inline btn-secondary{% if quarantine_notification == 'weekly' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="quarantine_notification"
@@ -93,19 +93,19 @@
<div class="col-sm-3 col-12 text-sm-end text-start text-xs-bold mb-4">{{ lang.user.quarantine_category }}:</div>
<div class="col-sm-9 col-12">
<div class="btn-group" data-acl="{{ acl.quarantine_category }}">
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-light{% if quarantine_category == 'reject' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-secondary{% if quarantine_category == 'reject' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="quarantine_category"
data-api-url='edit/quarantine_category'
data-api-attr='{"quarantine_category":"reject"}'>{{ lang.user.q_reject }}</button>
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-light{% if quarantine_category == 'add_header' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-secondary{% if quarantine_category == 'add_header' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="quarantine_category"
data-api-url='edit/quarantine_category'
data-api-attr='{"quarantine_category":"add_header"}'>{{ lang.user.q_add_header }}</button>
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-light{% if quarantine_category == 'all' %} btn-dark{% endif %}"
<button type="button" class="btn btn-sm btn-xs-third d-block d-sm-inline btn-secondary{% if quarantine_category == 'all' %} active{% endif %}"
data-action="edit_selected"
data-item="{{ mailcow_cc_username }}"
data-id="quarantine_category"

View File

@@ -106,7 +106,7 @@ services:
- rspamd
php-fpm-mailcow:
image: mailcow/phpfpm:1.84
image: mailcow/phpfpm:1.82
command: "php-fpm -d date.timezone=${TZ} -d expose_php=0"
depends_on:
- redis-mailcow
@@ -154,7 +154,7 @@ services:
- API_KEY_READ_ONLY=${API_KEY_READ_ONLY:-invalid}
- API_ALLOW_FROM=${API_ALLOW_FROM:-invalid}
- COMPOSE_PROJECT_NAME=${COMPOSE_PROJECT_NAME:-mailcow-dockerized}
- SKIP_SOLR=${SKIP_SOLR:-y}
- SKIP_XAPIAN=${SKIP_XAPIAN:-y}
- SKIP_CLAMD=${SKIP_CLAMD:-n}
- SKIP_SOGO=${SKIP_SOGO:-n}
- ALLOW_ADMIN_EMAIL_LOGIN=${ALLOW_ADMIN_EMAIL_LOGIN:-n}
@@ -169,7 +169,7 @@ services:
- phpfpm
sogo-mailcow:
image: mailcow/sogo:1.117
image: mailcow/sogo:1.115
environment:
- DBNAME=${DBNAME}
- DBUSER=${DBUSER}
@@ -191,7 +191,7 @@ services:
volumes:
- ./data/hooks/sogo:/hooks:Z
- ./data/conf/sogo/:/etc/sogo/:z
- ./data/web/inc/init_db.inc.php:/init_db.inc.php:z
- ./data/web/inc/init_db.inc.php:/init_db.inc.php:Z
- ./data/conf/sogo/custom-favicon.ico:/usr/lib/GNUstep/SOGo/WebServerResources/img/sogo.ico:z
- ./data/conf/sogo/custom-theme.js:/usr/lib/GNUstep/SOGo/WebServerResources/js/theme.js:z
- ./data/conf/sogo/custom-sogo.js:/usr/lib/GNUstep/SOGo/WebServerResources/js/custom-sogo.js:z
@@ -216,7 +216,7 @@ services:
- sogo
dovecot-mailcow:
image: mailcow/dovecot:1.24
image: mailcow/dovecot:1.20-xapian
depends_on:
- mysql-mailcow
dns:
@@ -250,7 +250,8 @@ services:
- ALLOW_ADMIN_EMAIL_LOGIN=${ALLOW_ADMIN_EMAIL_LOGIN:-n}
- MAILDIR_GC_TIME=${MAILDIR_GC_TIME:-7200}
- ACL_ANYONE=${ACL_ANYONE:-disallow}
- SKIP_SOLR=${SKIP_SOLR:-y}
- SKIP_XAPIAN=${SKIP_XAPIAN:-y}
- XAPIAN_HEAP=${XAPIAN_HEAP:-1024}
- MAILDIR_SUB=${MAILDIR_SUB:-}
- MASTER=${MASTER:-y}
- REDIS_SLAVEOF_IP=${REDIS_SLAVEOF_IP:-}
@@ -281,7 +282,7 @@ services:
ofelia.job-exec.dovecot_sarules.schedule: "@every 24h"
ofelia.job-exec.dovecot_sarules.command: "/bin/bash -c \"/usr/local/bin/sa-rules.sh\""
ofelia.job-exec.dovecot_fts.schedule: "@every 24h"
ofelia.job-exec.dovecot_fts.command: "/usr/bin/curl http://solr:8983/solr/dovecot-fts/update?optimize=true"
ofelia.job-exec.dovecot_fts.command: "doveadm fts optimize -A"
ofelia.job-exec.dovecot_repl_health.schedule: "@every 5m"
ofelia.job-exec.dovecot_repl_health.command: "/bin/bash -c \"/usr/local/bin/gosu vmail /usr/local/bin/repl_health.sh\""
ulimits:
@@ -425,7 +426,7 @@ services:
- acme
netfilter-mailcow:
image: mailcow/netfilter:1.52
image: mailcow/netfilter:1.51
stop_grace_period: 30s
depends_on:
- dovecot-mailcow
@@ -510,7 +511,7 @@ services:
- watchdog
dockerapi-mailcow:
image: mailcow/dockerapi:2.04
image: mailcow/dockerapi:2.01
security_opt:
- label=disable
restart: always
@@ -528,22 +529,6 @@ services:
aliases:
- dockerapi
solr-mailcow:
image: mailcow/solr:1.8.1
restart: always
volumes:
- solr-vol-1:/opt/solr/server/solr/dovecot-fts/data
ports:
- "${SOLR_PORT:-127.0.0.1:18983}:8983"
environment:
- TZ=${TZ}
- SOLR_HEAP=${SOLR_HEAP:-1024}
- SKIP_SOLR=${SKIP_SOLR:-y}
networks:
mailcow-network:
aliases:
- solr
olefy-mailcow:
image: mailcow/olefy:1.11
restart: always
@@ -599,7 +584,6 @@ services:
- netfilter-mailcow
- watchdog-mailcow
- dockerapi-mailcow
- solr-mailcow
environment:
- TZ=${TZ}
image: robbertkl/ipv6nat
@@ -631,7 +615,6 @@ volumes:
mysql-socket-vol-1:
redis-vol-1:
rspamd-vol-1:
solr-vol-1:
postfix-vol-1:
crypt-vol-1:
sogo-web-vol-1:


@@ -122,23 +122,23 @@ else
fi
if [ ${MEM_TOTAL} -le "2097152" ]; then
echo "Disabling Solr on low-memory system."
SKIP_SOLR=y
echo "Disabling Xapian (full text search, build in Dovecot) on low-memory system."
SKIP_XAPIAN=y
elif [ ${MEM_TOTAL} -le "3670016" ]; then
echo "Installed memory is <= 3.5 GiB. It is recommended to disable Solr to prevent out-of-memory situations."
echo "Solr is a prone to run OOM and should be monitored. The default Solr heap size is 1024 MiB and should be set in mailcow.conf according to your expected load."
echo "Solr can be re-enabled by setting SKIP_SOLR=n in mailcow.conf but will refuse to start with less than 2 GB total memory."
read -r -p "Do you want to disable Solr now? [Y/n] " response
echo "Installed memory is <= 3.5 GiB. We suggest you to disable Xapian (full text search, build in Dovecot) to prevent out-of-memory situations."
echo "The default Xapian heap size is 1024 MiB and should be set in mailcow.conf according to your expected load."
echo "Xapian can be re-enabled by setting SKIP_XAPIAN=n in mailcow.conf but will refuse to start with less than 2 GB total memory."
read -r -p "Do you want to disable the FTS Xapian now? [Y/n] " response
case $response in
[nN][oO]|[nN])
SKIP_SOLR=n
SKIP_XAPIAN=n
;;
*)
SKIP_SOLR=y
SKIP_XAPIAN=y
;;
esac
else
SKIP_SOLR=n
SKIP_XAPIAN=n
fi
if [[ ${SKIP_BRANCH} != y ]]; then
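For reference, a minimal sketch of the threshold logic above, assuming MemTotal is read from /proc/meminfo in KiB (the actual generate_config.sh may obtain the value differently):

    MEM_TOTAL=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)   # KiB
    if [ "${MEM_TOTAL}" -le 2097152 ]; then        # <= 2 GiB: FTS is forced off
      SKIP_XAPIAN=y
    elif [ "${MEM_TOTAL}" -le 3670016 ]; then      # <= 3.5 GiB: ask the admin
      read -r -p "Do you want to disable the FTS Xapian now? [Y/n] " response
      case ${response} in [nN][oO]|[nN]) SKIP_XAPIAN=n ;; *) SKIP_XAPIAN=y ;; esac
    else
      SKIP_XAPIAN=n
    fi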
@@ -205,8 +205,8 @@ DBUSER=mailcow
# Please use long, random alphanumeric strings (A-Za-z0-9)
DBPASS=$(LC_ALL=C </dev/urandom tr -dc A-Za-z0-9 2> /dev/null | head -c 28)
DBROOT=$(LC_ALL=C </dev/urandom tr -dc A-Za-z0-9 2> /dev/null | head -c 28)
DBPASS=$(LC_ALL=C </dev/urandom tr -dc A-Za-z0-9 | head -c 28)
DBROOT=$(LC_ALL=C </dev/urandom tr -dc A-Za-z0-9 | head -c 28)
# ------------------------------
# HTTP/S Bindings
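The added 2> /dev/null on the secret generators presumably silences the harmless "write error: Broken pipe" that tr can print once head has taken its 28 characters and closed the pipe. A standalone sketch of the same pattern:

    # Generate a 28-character alphanumeric secret; stderr is discarded so a
    # broken-pipe complaint from tr never ends up in the generated config.
    DBPASS=$(LC_ALL=C tr -dc A-Za-z0-9 < /dev/urandom 2> /dev/null | head -c 28)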
@@ -243,7 +243,6 @@ POPS_PORT=995
SIEVE_PORT=4190
DOVEADM_PORT=127.0.0.1:19991
SQL_PORT=127.0.0.1:13306
SOLR_PORT=127.0.0.1:18983
REDIS_PORT=127.0.0.1:7654
# Your timezone
@@ -261,7 +260,7 @@ COMPOSE_PROJECT_NAME=mailcowdockerized
# Switch here between native (compose plugin) and standalone
# For more informations take a look at the mailcow docs regarding the configuration options.
# Normally this should be untouched but if you decided to use either of those you can switch it manually here.
# Please be aware that at least one of those variants should be installed on your machine or mailcow will fail.
# Please be aware that at least one of those variants should be installed on your maschine or mailcow will fail.
DOCKER_COMPOSE_VERSION=${COMPOSE_VERSION}
@@ -329,14 +328,14 @@ SKIP_CLAMD=${SKIP_CLAMD}
SKIP_SOGO=n
# Skip Solr on low-memory systems or if you do not want to store a readable index of your mails in solr-vol-1.
# Skip Xapian (FTS) on low-memory systems or if you do not want to store a readable index of your mails in vmail-index-vol-1.
SKIP_SOLR=${SKIP_SOLR}
SKIP_XAPIAN=${SKIP_XAPIAN}
# Solr heap size in MB, there is no recommendation, please see Solr docs.
# Solr is a prone to run OOM and should be monitored. Unmonitored Solr setups are not recommended.
# Xapian heap size in MB, there is no recommendation, please see Xapians docs.
# Xapian is replacing solr completely. It is supposed to be much more efficient in CPU and RAM consumption.
SOLR_HEAP=1024
XAPIAN_HEAP=1024
# Allow admins to log into SOGo as email user (without any password)


@@ -26,6 +26,6 @@ services:
- /var/run/mysqld/mysqld.sock:/var/run/mysqld/mysqld.sock
mysql-mailcow:
image: alpine:3.18
image: alpine:3.17
command: /bin/true
restart: "no"

helper-scripts/expiry-dates.sh Executable file → Normal file

@@ -3,11 +3,10 @@
[[ -f mailcow.conf ]] && source mailcow.conf
[[ -f ../mailcow.conf ]] && source ../mailcow.conf
POSTFIX=$(echo | openssl s_client -connect ${MAILCOW_HOSTNAME}:${SMTP_PORT} -starttls smtp 2>/dev/null | openssl x509 -inform pem -noout -enddate | cut -d "=" -f 2)
DOVECOT=$(echo | openssl s_client -connect ${MAILCOW_HOSTNAME}:${IMAP_PORT} -starttls imap 2>/dev/null | openssl x509 -inform pem -noout -enddate | cut -d "=" -f 2)
NGINX=$(echo | openssl s_client -connect ${MAILCOW_HOSTNAME}:${HTTPS_PORT} 2>/dev/null | openssl x509 -inform pem -noout -enddate | cut -d "=" -f 2)
echo "TLS expiry dates:"
echo "Postfix: ${POSTFIX}"
echo "Dovecot: ${DOVECOT}"
echo "Nginx: ${NGINX}"
POSTFIX=$(echo | openssl s_client -connect ${MAILCOW_HOSTNAME}:25 -starttls smtp 2>/dev/null | openssl x509 -inform pem -noout -enddate | cut -d "=" -f 2)
DOVECOT=$(echo | openssl s_client -connect ${MAILCOW_HOSTNAME}:143 -starttls imap 2>/dev/null | openssl x509 -inform pem -noout -enddate | cut -d "=" -f 2)
NGINX=$(echo | openssl s_client -connect ${MAILCOW_HOSTNAME}:443 2>/dev/null | openssl x509 -inform pem -noout -enddate | cut -d "=" -f 2)
echo TLS expiry dates:
echo Postfix: ${POSTFIX}
echo Dovecot: ${DOVECOT}
echo Nginx: ${NGINX}
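The helper simply asks each service for its certificate and prints the notAfter date. A hedged one-off check against a single endpoint (mail.example.com and port 443 are placeholders):

    echo | openssl s_client -connect mail.example.com:443 2>/dev/null \
      | openssl x509 -noout -enddate | cut -d '=' -f 2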


@@ -19,7 +19,7 @@ read -r -p "Are you sure you want to reset the mailcow administrator account? [y
response=${response,,} # tolower
if [[ "$response" =~ ^(yes|y)$ ]]; then
echo -e "\nWorking, please wait..."
random=$(</dev/urandom tr -dc _A-Z-a-z-0-9 2> /dev/null | head -c${1:-16})
random=$(</dev/urandom tr -dc _A-Z-a-z-0-9 | head -c${1:-16})
password=$(docker exec -it $(docker ps -qf name=dovecot-mailcow) doveadm pw -s SSHA256 -p ${random} | tr -d '\r')
docker exec -it $(docker ps -qf name=mysql-mailcow) mysql -u${DBUSER} -p${DBPASS} ${DBNAME} -e "DELETE FROM admin WHERE username='admin';"
docker exec -it $(docker ps -qf name=mysql-mailcow) mysql -u${DBUSER} -p${DBPASS} ${DBNAME} -e "DELETE FROM domain_admins WHERE username='admin';"
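The reset script hashes the generated password with doveadm before writing it to the admin table. A sketch of the hashing step on its own, assuming the usual dovecot-mailcow container name:

    # Produce an SSHA256 hash of a throwaway password; tr strips the carriage return
    # that docker exec -it appends to the output.
    docker exec -it $(docker ps -qf name=dovecot-mailcow) \
      doveadm pw -s SSHA256 -p 'example-password' | tr -d '\r'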


@@ -1,6 +1,6 @@
#!/usr/bin/env bash
# renovate: datasource=github-releases depName=nextcloud/server versioning=semver extractVersion=^v(?<version>.*)$
NEXTCLOUD_VERSION=26.0.2
NEXTCLOUD_VERSION=25.0.3
echo -ne "Checking prerequisites..."
sleep 1
@@ -97,12 +97,8 @@ elif [[ ${NC_UPDATE} == "y" ]]; then
echo -e "\033[31mError: Nextcloud occ not found. Is Nextcloud installed?\033[0m"
exit 1
fi
if grep -Pq 'This version of Nextcloud is not compatible with (?:PHP)?(?>=?)(?:PHP)?(?>.+)' <<<$(docker exec -it -u www-data $(docker ps -f name=php-fpm-mailcow -q) bash -c "/web/nextcloud/occ --no-warnings status"); then
echo -e "\033[31mError: This version of Nextcloud is not compatible with the current PHP version of php-fpm-mailcow, we'll fix it\033[0m"
wget -q https://raw.githubusercontent.com/nextcloud/server/v26.0.0/lib/versioncheck.php -O ./data/web/nextcloud/lib/versioncheck.php
echo -e "\e[33mPlease restart the update again.\e[0m"
elif ! grep -q 'installed: true' <<<$(docker exec -it -u www-data $(docker ps -f name=php-fpm-mailcow -q) bash -c "/web/nextcloud/occ --no-warnings status"); then
echo -e "\033[31mError: Nextcloud seems not to be installed.\033[0m"
if ! grep -q 'installed: true' <<<$(docker exec -it -u www-data $(docker ps -f name=php-fpm-mailcow -q) bash -c "/web/nextcloud/occ --no-warnings status"); then
echo "Nextcloud seems not to be installed."
exit 1
else
docker exec -it -u www-data $(docker ps -f name=php-fpm-mailcow -q) bash -c "php /web/nextcloud/updater/updater.phar"
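The helper decides between the update and error paths by grepping occ status output. A sketch of the same probe run manually, using mailcow's default php-fpm-mailcow container name:

    # Check whether Nextcloud reports itself as installed ('installed: true')
    docker exec -it -u www-data $(docker ps -qf name=php-fpm-mailcow) \
      bash -c "/web/nextcloud/occ --no-warnings status"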
@@ -126,7 +122,7 @@ elif [[ ${NC_INSTALL} == "y" ]]; then
&& chmod +x ./data/web/nextcloud/occ
echo -e "\033[33mCreating 'nextcloud' database...\033[0m"
NC_DBPASS=$(</dev/urandom tr -dc A-Za-z0-9 2> /dev/null | head -c 28)
NC_DBPASS=$(</dev/urandom tr -dc A-Za-z0-9 | head -c 28)
NC_DBUSER=nextcloud
NC_DBNAME=nextcloud
@@ -142,7 +138,7 @@ elif [[ ${NC_INSTALL} == "y" ]]; then
echo ""
echo -e "\033[33mInstalling Nextcloud...\033[0m"
ADMIN_NC_PASS=$(</dev/urandom tr -dc A-Za-z0-9 2> /dev/null | head -c 28)
ADMIN_NC_PASS=$(</dev/urandom tr -dc A-Za-z0-9 | head -c 28)
echo -ne "[1/4] Setting correct permissions for www-data"
docker exec -it $(docker ps -f name=php-fpm-mailcow -q) /bin/bash -c "chown -R www-data:www-data /web/nextcloud"


@@ -6,7 +6,7 @@ SPFTOOLS_DIR=${WORKING_DIR}/spf-tools
POSTWHITE_DIR=${WORKING_DIR}/postwhite
POSTWHITE_CONF=${POSTWHITE_DIR}/postwhite.conf
CUSTOM_HOSTS='"web.de gmx.net mail.de freenet.de arcor.de unity-mail.de"'
COSTOM_HOSTS="web.de gmx.net mail.de freenet.de arcor.de unity-mail.de"
STATIC_HOSTS=(
"194.25.134.0/24 permit # t-online.de"
)
@@ -19,23 +19,16 @@ function set_config() {
sudo sed -i "s@^\($1\s*=\s*\).*\$@\1$2@" ${POSTWHITE_CONF}
}
set_config custom_hosts "${CUSTOM_HOSTS}"
set_config custom_hosts ${COSTOM_HOSTS}
set_config reload_postfix no
set_config postfixpath /.
set_config spftoolspath ${WORKING_DIR}/spf-tools
set_config whitelist .${SCRIPT_DIR}/../data/conf/postfix/postscreen_access.cidr
set_config yahoo_static_hosts ${POSTWHITE_DIR}/yahoo_static_hosts.txt
#Fix URL for Yahoo!: https://github.com/stevejenkins/postwhite/issues/59
sudo sed -i \
-e 's#yahoo_url="https://help.yahoo.com/kb/SLN23997.html"#yahoo_url="https://senders.yahooinc.com/outbound-mail-servers/"#' \
-e 's#echo "ipv6:$line";#echo "ipv6:$line" | grep -v "ipv6:::";#' \
-e 's#`command -v wget`#`command -v skip-wget`#' \
${POSTWHITE_DIR}/scrape_yahoo
cd ${POSTWHITE_DIR}
./postwhite ${POSTWHITE_CONF}
( IFS=$'\n'; echo "${STATIC_HOSTS[*]}" >> "${SCRIPT_DIR}/../data/conf/postfix/postscreen_access.cidr")
rm -r ${WORKING_DIR}
rm -r ${WORKING_DIR}
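One side of this hunk quotes the host list (CUSTOM_HOSTS), the other passes it unquoted under the misspelled name COSTOM_HOSTS; the distinction matters because set_config is a plain sed rewrite, and an unquoted multi-word value is word-split so that only the first host reaches $2. A minimal sketch of the helper and a call that keeps the list intact (the config path is a placeholder):

    POSTWHITE_CONF=./postwhite.conf   # placeholder path for this sketch
    set_config() { sudo sed -i "s@^\($1\s*=\s*\).*\$@\1$2@" "${POSTWHITE_CONF}"; }
    set_config custom_hosts '"web.de gmx.net mail.de"'   # quoted, so the whole list survives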

update.sh

@@ -176,19 +176,18 @@ remove_obsolete_nginx_ports() {
}
detect_docker_compose_command(){
if ! [[ "${DOCKER_COMPOSE_VERSION}" =~ ^(native|standalone)$ ]]; then
if ! [ "${DOCKER_COMPOSE_VERSION}" == "native" ] && ! [ "${DOCKER_COMPOSE_VERSION}" == "standalone" ]; then
if docker compose > /dev/null 2>&1; then
if docker compose version --short | grep "2." > /dev/null 2>&1; then
DOCKER_COMPOSE_VERSION=native
COMPOSE_COMMAND="docker compose"
echo -e "\e[31mFound Docker Compose Plugin (native).\e[0m"
echo -e "\e[31mSetting the DOCKER_COMPOSE_VERSION Variable to native\e[0m"
sed -i 's/^DOCKER_COMPOSE_VERSION=.*/DOCKER_COMPOSE_VERSION=native/' $SCRIPT_DIR/mailcow.conf
sleep 2
echo -e "\e[33mNotice: You'll have to update this Compose Version via your Package Manager manually!\e[0m"
else
echo -e "\e[31mCannot find Docker Compose with a Version Higher than 2.X.X.\e[0m"
echo -e "\e[31mPlease update/install it manually regarding to this doc site: https://docs.mailcow.email/i_u_m/i_u_m_install/\e[0m"
echo -e "\e[31mPlease update/install it manually regarding to this doc site: https://mailcow.github.io/mailcow-dockerized-docs/i_u_m/i_u_m_install/\e[0m"
exit 1
fi
elif docker-compose > /dev/null 2>&1; then
@@ -198,60 +197,26 @@ if ! [[ "${DOCKER_COMPOSE_VERSION}" =~ ^(native|standalone)$ ]]; then
COMPOSE_COMMAND="docker-compose"
echo -e "\e[31mFound Docker Compose Standalone.\e[0m"
echo -e "\e[31mSetting the DOCKER_COMPOSE_VERSION Variable to standalone\e[0m"
sed -i 's/^DOCKER_COMPOSE_VERSION=.*/DOCKER_COMPOSE_VERSION=standalone/' $SCRIPT_DIR/mailcow.conf
sleep 2
echo -e "\e[33mNotice: For an automatic update of docker-compose please use the update_compose.sh scripts located at the helper-scripts folder.\e[0m"
else
echo -e "\e[31mCannot find Docker Compose with a Version Higher than 2.X.X.\e[0m"
echo -e "\e[31mPlease update/install regarding to this doc site: https://docs.mailcow.email/i_u_m/i_u_m_install/\e[0m"
echo -e "\e[31mPlease update/install regarding to this doc site: https://mailcow.github.io/mailcow-dockerized-docs/i_u_m/i_u_m_install/\e[0m"
exit 1
fi
fi
else
echo -e "\e[31mCannot find Docker Compose.\e[0m"
echo -e "\e[31mPlease install it regarding to this doc site: https://docs.mailcow.email/i_u_m/i_u_m_install/\e[0m"
echo -e "\e[31mPlease install it regarding to this doc site: https://mailcow.github.io/mailcow-dockerized-docs/i_u_m/i_u_m_install/\e[0m"
exit 1
fi
elif [ "${DOCKER_COMPOSE_VERSION}" == "native" ]; then
COMPOSE_COMMAND="docker compose"
# Check if Native Compose works and has not been deleted
if ! $COMPOSE_COMMAND > /dev/null 2>&1; then
# IF it not exists/work anymore try the other command
COMPOSE_COMMAND="docker-compose"
if ! $COMPOSE_COMMAND > /dev/null 2>&1 || ! $COMPOSE_COMMAND --version | grep "^2." > /dev/null 2>&1; then
# IF it cannot find Standalone in > 2.X, then script stops
echo -e "\e[31mCannot find Docker Compose or the Version is lower then 2.X.X.\e[0m"
echo -e "\e[31mPlease install it regarding to this doc site: https://docs.mailcow.email/i_u_m/i_u_m_install/\e[0m"
exit 1
fi
# If it finds the standalone Plugin it will use this instead and change the mailcow.conf Variable accordingly
echo -e "\e[31mFound different Docker Compose Version then declared in mailcow.conf!\e[0m"
echo -e "\e[31mSetting the DOCKER_COMPOSE_VERSION Variable from native to standalone\e[0m"
sed -i 's/^DOCKER_COMPOSE_VERSION=.*/DOCKER_COMPOSE_VERSION=standalone/' $SCRIPT_DIR/mailcow.conf
sleep 2
fi
elif [ "${DOCKER_COMPOSE_VERSION}" == "standalone" ]; then
COMPOSE_COMMAND="docker-compose"
# Check if Standalone Compose works and has not been deleted
if ! $COMPOSE_COMMAND > /dev/null 2>&1 && ! $COMPOSE_COMMAND --version > /dev/null 2>&1 | grep "^2." > /dev/null 2>&1; then
# IF it not exists/work anymore try the other command
COMPOSE_COMMAND="docker compose"
if ! $COMPOSE_COMMAND > /dev/null 2>&1; then
# IF it cannot find Native in > 2.X, then script stops
echo -e "\e[31mCannot find Docker Compose.\e[0m"
echo -e "\e[31mPlease install it regarding to this doc site: https://docs.mailcow.email/i_u_m/i_u_m_install/\e[0m"
exit 1
fi
# If it finds the native Plugin it will use this instead and change the mailcow.conf Variable accordingly
echo -e "\e[31mFound different Docker Compose Version then declared in mailcow.conf!\e[0m"
echo -e "\e[31mSetting the DOCKER_COMPOSE_VERSION Variable from standalone to native\e[0m"
sed -i 's/^DOCKER_COMPOSE_VERSION=.*/DOCKER_COMPOSE_VERSION=native/' $SCRIPT_DIR/mailcow.conf
sleep 2
fi
fi
}
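One version of this function validates DOCKER_COMPOSE_VERSION with a single regex, the other with two string comparisons plus per-flavour fallback branches. A minimal sketch of the regex-based entry check, under the assumption that anything other than native or standalone should fall through to auto-detection:

    # mailcow.conf is expected to have been sourced so DOCKER_COMPOSE_VERSION is set
    if [[ "${DOCKER_COMPOSE_VERSION}" =~ ^(native|standalone)$ ]]; then
      echo "Compose flavour already pinned in mailcow.conf: ${DOCKER_COMPOSE_VERSION}"
    elif docker compose version --short 2>/dev/null | grep -q '^2\.'; then
      COMPOSE_COMMAND="docker compose"     # native plugin found, record it
      sed -i 's/^DOCKER_COMPOSE_VERSION=.*/DOCKER_COMPOSE_VERSION=native/' mailcow.conf
    fi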
@@ -361,12 +326,8 @@ while (($#)); do
echo -e "\e[32mRunning in forced mode...\e[0m"
FORCE=y
;;
-d|--dev)
echo -e "\e[32mRunning in Developer mode...\e[0m"
DEV=y
;;
--help|-h)
echo './update.sh [-c|--check, --ours, --gc, --nightly, --prefetch, --skip-start, --skip-ping-check, --stable, -f|--force, -d|--dev, -h|--help]
echo './update.sh [-c|--check, --ours, --gc, --nightly, --prefetch, --skip-start, --skip-ping-check, --stable, -f|--force, -h|--help]
-c|--check - Check for updates and exit (exit codes => 0: update available, 3: no updates)
--ours - Use merge strategy option "ours" to solve conflicts in favor of non-mailcow code (local changes over remote changes), not recommended!
@@ -377,7 +338,6 @@ while (($#)); do
--skip-ping-check - Skip ICMP Check to public DNS resolvers (Use it only if you´ve blocked any ICMP Connections to your mailcow machine)
--stable - Switch your mailcow updates to the stable (master) branch. Default unless you changed it with --nightly.
-f|--force - Force update, do not ask questions
-d|--dev - Enables Developer Mode (No Checkout of update.sh for tests)
'
exit 1
esac
@@ -428,8 +388,8 @@ CONFIG_ARRAY=(
"MAILDIR_GC_TIME"
"MAILDIR_SUB"
"ACL_ANYONE"
"SOLR_HEAP"
"SKIP_SOLR"
"XAPIAN_HEAP"
"SKIP_XAPIAN"
"ENABLE_SSL_SNI"
"ALLOW_ADMIN_EMAIL_LOGIN"
"SKIP_HTTP_VERIFICATION"
@@ -550,20 +510,20 @@ for option in ${CONFIG_ARRAY[@]}; do
echo '# Otherwise a user might share data with too many other users.' >> mailcow.conf
echo 'ACL_ANYONE=disallow' >> mailcow.conf
fi
elif [[ ${option} == "SOLR_HEAP" ]]; then
elif [[ ${option} == "XAPIAN_HEAP" ]]; then
if ! grep -q ${option} mailcow.conf; then
echo "Adding new option \"${option}\" to mailcow.conf"
echo '# Solr heap size, there is no recommendation, please see Solr docs.' >> mailcow.conf
echo '# Solr is a prone to run OOM on large systems and should be monitored. Unmonitored Solr setups are not recommended.' >> mailcow.conf
echo '# Solr will refuse to start with total system memory below or equal to 2 GB.' >> mailcow.conf
echo "SOLR_HEAP=1024" >> mailcow.conf
echo "Replacing SOLR_HEAP with \"${option}\" in mailcow.conf"
sed -i '/# Solr heap size in MB, there is no recommendation, please see Solr docs./c\# Xapian heap size in MB, there is no recommendation, please see Xapians docs.' mailcow.conf
sed -i '/# Solr is a prone to run OOM on large systems and should be monitored. Unmonitored Solr setups are not recommended./c\# Xapian is replacing solr completely. It is supposed to be much more efficient in CPU and RAM consumption.' mailcow.conf
sed -i '/SOLR_HEAP=/c\XAPIAN_HEAP=' mailcow.conf
fi
elif [[ ${option} == "SKIP_SOLR" ]]; then
elif [[ ${option} == "SKIP_XAPIAN" ]]; then
if ! grep -q ${option} mailcow.conf; then
echo "Adding new option \"${option}\" to mailcow.conf"
echo '# Solr is disabled by default after upgrading from non-Solr to Solr-enabled mailcows.' >> mailcow.conf
echo '# Disable Solr or if you do not want to store a readable index of your mails in solr-vol-1.' >> mailcow.conf
echo "SKIP_SOLR=y" >> mailcow.conf
echo "Replacing SKIP_SOLR with \"${option}\" in mailcow.conf"
sed -i '/# Skip Solr on low-memory systems or if you do not want to store a readable index of your mails in solr-vol-1./c\# Skip Xapian (FTS) on low-memory systems or if you do not want to store a readable index of your mails in vmail-index-vol-1.' mailcow.conf
sed -i '/SKIP_SOLR=/c\SKIP_XAPIAN=' mailcow.conf
echo "Removing Solr-Port Binding from mailcow.conf"
sed -i '/SOLR_PORT=/d' mailcow.conf
fi
elif [[ ${option} == "ENABLE_SSL_SNI" ]]; then
if ! grep -q ${option} mailcow.conf; then
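The migration block above rewrites the Solr comments and keys in mailcow.conf with sed c\ (change-line) commands, which replace each matched line wholesale. A hedged alternative sketch, not the project's code, that renames the keys while preserving whatever values the admin had set:

    sed -i 's/^SKIP_SOLR=/SKIP_XAPIAN=/' mailcow.conf    # keep the existing y/n choice
    sed -i 's/^SOLR_HEAP=/XAPIAN_HEAP=/' mailcow.conf    # keep the configured heap size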
@@ -637,7 +597,7 @@ for option in ${CONFIG_ARRAY[@]}; do
echo "Adding new option \"${option}\" to mailcow.conf"
echo '# Password hash algorithm' >> mailcow.conf
echo '# Only certain password hash algorithm are supported. For a fully list of supported schemes,' >> mailcow.conf
echo '# see https://docs.mailcow.email/models/model-passwd/' >> mailcow.conf
echo '# see https://mailcow.github.io/mailcow-dockerized-docs/models/model-passwd/' >> mailcow.conf
echo "MAILCOW_PASS_SCHEME=BLF-CRYPT" >> mailcow.conf
fi
elif [[ ${option} == "ADDITIONAL_SERVER_NAMES" ]]; then
@@ -657,7 +617,7 @@ for option in ${CONFIG_ARRAY[@]}; do
echo '# Optional: Leave empty for none' >> mailcow.conf
echo '# This value is only used on first order!' >> mailcow.conf
echo '# Setting it at a later point will require the following steps:' >> mailcow.conf
echo '# https://docs.mailcow.email/troubleshooting/debug-reset_tls/' >> mailcow.conf
echo '# https://mailcow.github.io/mailcow-dockerized-docs/troubleshooting/debug-reset_tls/' >> mailcow.conf
echo 'ACME_CONTACT=' >> mailcow.conf
fi
elif [[ ${option} == "WEBAUTHN_ONLY_TRUSTED_VENDORS" ]]; then
@@ -767,17 +727,15 @@ elif [ $NEW_BRANCH == "nightly" ] && [ $CURRENT_BRANCH != "nightly" ]; then
git checkout -f ${BRANCH}
fi
if [ ! $DEV ]; then
echo -e "\e[32mChecking for newer update script...\e[0m"
SHA1_1=$(sha1sum update.sh)
git fetch origin #${BRANCH}
git checkout origin/${BRANCH} update.sh
SHA1_2=$(sha1sum update.sh)
if [[ ${SHA1_1} != ${SHA1_2} ]]; then
echo "update.sh changed, please run this script again, exiting."
chmod +x update.sh
exit 2
fi
echo -e "\e[32mChecking for newer update script...\e[0m"
SHA1_1=$(sha1sum update.sh)
git fetch origin #${BRANCH}
git checkout origin/${BRANCH} update.sh
SHA1_2=$(sha1sum update.sh)
if [[ ${SHA1_1} != ${SHA1_2} ]]; then
echo "update.sh changed, please run this script again, exiting."
chmod +x update.sh
exit 2
fi
if [ ! $FORCE ]; then
@@ -944,6 +902,9 @@ else
echo -e "\e[33mCannot determine current git repository version...\e[0m"
fi
# Set DOCKER_COMPOSE_VERSION
sed -i 's/^DOCKER_COMPOSE_VERSION=$/DOCKER_COMPOSE_VERSION='$DOCKER_COMPOSE_VERSION'/g' mailcow.conf
if [[ ${SKIP_START} == "y" ]]; then
echo -e "\e[33mNot starting mailcow, please run \"$COMPOSE_COMMAND up -d --remove-orphans\" to start mailcow.\e[0m"
else