Started by upstream project "Trigger" build number 258
originally caused by:
 Started by timer
Running as SYSTEM
Building remotely on testintegration (swarm rocky9 ice36 java11) in workspace /home/omero/workspace/OMERO-test-integration
[OMERO-test-integration] $ /bin/bash -xe /tmp/jenkins53968946370306487.sh
+ rm -rf /home/omero/workspace/OMERO-test-integration/.venv3
+ python3 -m venv /home/omero/workspace/OMERO-test-integration/.venv3
+ source /home/omero/workspace/OMERO-test-integration/.venv3/bin/activate
++ deactivate nondestructive
++ '[' -n '' ']'
++ '[' -n '' ']'
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
++ '[' -n '' ']'
++ unset VIRTUAL_ENV
++ '[' '!' nondestructive = nondestructive ']'
++ VIRTUAL_ENV=/home/omero/workspace/OMERO-test-integration/.venv3
++ export VIRTUAL_ENV
++ _OLD_VIRTUAL_PATH=/opt/ice-3.6.5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ PATH=/home/omero/workspace/OMERO-test-integration/.venv3/bin:/opt/ice-3.6.5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ export PATH
++ '[' -n '' ']'
++ '[' -z '' ']'
++ _OLD_VIRTUAL_PS1=
++ PS1='(.venv3) '
++ export PS1
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
+ pip install https://github.com/glencoesoftware/zeroc-ice-py-rhel9-x86_64/releases/download/20230830/zeroc_ice-3.6.5-cp39-cp39-linux_x86_64.whl
Collecting zeroc-ice==3.6.5
  Downloading https://github.com/glencoesoftware/zeroc-ice-py-rhel9-x86_64/releases/download/20230830/zeroc_ice-3.6.5-cp39-cp39-linux_x86_64.whl (31.3 MB)
Installing collected packages: zeroc-ice
Successfully installed zeroc-ice-3.6.5
+ pip install -U pip future setuptools
Requirement already satisfied: pip in ./.venv3/lib/python3.9/site-packages (21.2.3)
Collecting pip
  Using cached pip-24.2-py3-none-any.whl (1.8 MB)
Collecting future
  Using cached future-1.0.0-py3-none-any.whl (491 kB)
Requirement already satisfied: setuptools in ./.venv3/lib/python3.9/site-packages (53.0.0)
Collecting setuptools
  Using cached setuptools-75.2.0-py3-none-any.whl (1.2 MB)
Installing collected packages: setuptools, pip, future
  Attempting uninstall: setuptools
    Found existing installation: setuptools 53.0.0
    Uninstalling setuptools-53.0.0:
      Successfully uninstalled setuptools-53.0.0
  Attempting uninstall: pip
    Found existing installation: pip 21.2.3
    Uninstalling pip-21.2.3:
      Successfully uninstalled pip-21.2.3
Successfully installed future-1.0.0 pip-24.2 setuptools-75.2.0
+ pip install markdown
Collecting markdown
  Using cached Markdown-3.7-py3-none-any.whl.metadata (7.0 kB)
Collecting importlib-metadata>=4.4 (from markdown)
  Using cached importlib_metadata-8.5.0-py3-none-any.whl.metadata (4.8 kB)
Collecting zipp>=3.20 (from importlib-metadata>=4.4->markdown)
  Using cached zipp-3.20.2-py3-none-any.whl.metadata (3.7 kB)
Using cached Markdown-3.7-py3-none-any.whl (106 kB)
Using cached importlib_metadata-8.5.0-py3-none-any.whl (26 kB)
Using cached zipp-3.20.2-py3-none-any.whl (9.2 kB)
Installing collected packages: zipp, importlib-metadata, markdown
Successfully installed importlib-metadata-8.5.0 markdown-3.7 zipp-3.20.2
+ pip install mox3 pytest pytest-django pytest-xdist pytest-mock
Collecting mox3
  Using cached mox3-1.1.0-py3-none-any.whl.metadata (3.2 kB)
Collecting pytest
  Using cached pytest-8.3.3-py3-none-any.whl.metadata (7.5 kB)
Collecting pytest-django
  Using cached pytest_django-4.9.0-py3-none-any.whl.metadata (8.2 kB)
Collecting pytest-xdist
  Using cached pytest_xdist-3.6.1-py3-none-any.whl.metadata (4.3 kB)
Collecting pytest-mock
  Using cached
pytest_mock-3.14.0-py3-none-any.whl.metadata (3.8 kB) Collecting pbr!=2.1.0,>=2.0.0 (from mox3) Using cached pbr-6.1.0-py2.py3-none-any.whl.metadata (3.4 kB) Collecting fixtures>=3.0.0 (from mox3) Using cached fixtures-4.1.0-py3-none-any.whl.metadata (21 kB) Collecting iniconfig (from pytest) Using cached iniconfig-2.0.0-py3-none-any.whl.metadata (2.6 kB) Collecting packaging (from pytest) Using cached packaging-24.1-py3-none-any.whl.metadata (3.2 kB) Collecting pluggy<2,>=1.5 (from pytest) Using cached pluggy-1.5.0-py3-none-any.whl.metadata (4.8 kB) Collecting exceptiongroup>=1.0.0rc8 (from pytest) Using cached exceptiongroup-1.2.2-py3-none-any.whl.metadata (6.6 kB) Collecting tomli>=1 (from pytest) Using cached tomli-2.0.2-py3-none-any.whl.metadata (10.0 kB) Collecting execnet>=2.1 (from pytest-xdist) Using cached execnet-2.1.1-py3-none-any.whl.metadata (2.9 kB) Using cached mox3-1.1.0-py3-none-any.whl (43 kB) Using cached pytest-8.3.3-py3-none-any.whl (342 kB) Using cached pytest_django-4.9.0-py3-none-any.whl (23 kB) Using cached pytest_xdist-3.6.1-py3-none-any.whl (46 kB) Using cached pytest_mock-3.14.0-py3-none-any.whl (9.9 kB) Using cached exceptiongroup-1.2.2-py3-none-any.whl (16 kB) Using cached execnet-2.1.1-py3-none-any.whl (40 kB) Using cached fixtures-4.1.0-py3-none-any.whl (64 kB) Using cached pbr-6.1.0-py2.py3-none-any.whl (108 kB) Using cached pluggy-1.5.0-py3-none-any.whl (20 kB) Using cached tomli-2.0.2-py3-none-any.whl (13 kB) Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB) Using cached packaging-24.1-py3-none-any.whl (53 kB) Installing collected packages: tomli, pluggy, pbr, packaging, iniconfig, execnet, exceptiongroup, pytest, fixtures, pytest-xdist, pytest-mock, pytest-django, mox3 Successfully installed exceptiongroup-1.2.2 execnet-2.1.1 fixtures-4.1.0 iniconfig-2.0.0 mox3-1.1.0 packaging-24.1 pbr-6.1.0 pluggy-1.5.0 pytest-8.3.3 pytest-django-4.9.0 pytest-mock-3.14.0 pytest-xdist-3.6.1 tomli-2.0.2 + pip install tables Collecting tables Using cached tables-3.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.3 kB) Collecting numpy>=1.19.0 (from tables) Using cached numpy-2.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (60 kB) Collecting numexpr>=2.6.2 (from tables) Using cached numexpr-2.10.1-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.metadata (1.2 kB) Requirement already satisfied: packaging in ./.venv3/lib/python3.9/site-packages (from tables) (24.1) Collecting py-cpuinfo (from tables) Using cached py_cpuinfo-9.0.0-py3-none-any.whl.metadata (794 bytes) Collecting blosc2>=2.3.0 (from tables) Using cached blosc2-2.5.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (9.2 kB) Collecting ndindex>=1.4 (from blosc2>=2.3.0->tables) Using cached ndindex-1.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.4 kB) Collecting msgpack (from blosc2>=2.3.0->tables) Using cached msgpack-1.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (8.4 kB) Using cached tables-3.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (7.5 MB) Using cached blosc2-2.5.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.4 MB) Using cached numexpr-2.10.1-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (404 kB) Using cached numpy-2.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (19.5 MB) Using cached py_cpuinfo-9.0.0-py3-none-any.whl (22 kB) Using cached ndindex-1.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (478 kB) 
Using cached msgpack-1.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (377 kB) Installing collected packages: py-cpuinfo, numpy, ndindex, msgpack, numexpr, blosc2, tables Successfully installed blosc2-2.5.1 msgpack-1.1.0 ndindex-1.9.2 numexpr-2.10.1 numpy-2.0.2 py-cpuinfo-9.0.0 tables-3.9.2 + pip install jinja2 Collecting jinja2 Using cached jinja2-3.1.4-py3-none-any.whl.metadata (2.6 kB) Collecting MarkupSafe>=2.0 (from jinja2) Using cached MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.0 kB) Using cached jinja2-3.1.4-py3-none-any.whl (133 kB) Using cached MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (20 kB) Installing collected packages: MarkupSafe, jinja2 Successfully installed MarkupSafe-3.0.2 jinja2-3.1.4 + pip install PyYAML Collecting PyYAML Using cached PyYAML-6.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB) Using cached PyYAML-6.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (737 kB) Installing collected packages: PyYAML Successfully installed PyYAML-6.0.2 + pip install omero-py omero-web Collecting omero-py Using cached omero_py-5.19.5-py3-none-any.whl.metadata (6.3 kB) Collecting omero-web Using cached omero_web-5.27.2-py3-none-any.whl.metadata (6.4 kB) Collecting urllib3<2 (from omero-py) Using cached urllib3-1.26.20-py2.py3-none-any.whl.metadata (50 kB) Collecting appdirs (from omero-py) Using cached appdirs-1.4.4-py2.py3-none-any.whl.metadata (9.0 kB) Requirement already satisfied: future in ./.venv3/lib/python3.9/site-packages (from omero-py) (1.0.0) Collecting numpy<2 (from omero-py) Using cached numpy-1.26.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (61 kB) Collecting Pillow>=10.0.0 (from omero-py) Using cached pillow-11.0.0-cp39-cp39-manylinux_2_28_x86_64.whl.metadata (9.1 kB) Requirement already satisfied: PyYAML in ./.venv3/lib/python3.9/site-packages (from omero-py) (6.0.2) Requirement already satisfied: zeroc-ice<3.7,>=3.6.5 in ./.venv3/lib/python3.9/site-packages (from omero-py) (3.6.5) Collecting requests (from omero-py) Using cached requests-2.32.3-py3-none-any.whl.metadata (4.6 kB) Collecting portalocker (from omero-py) Using cached portalocker-2.10.1-py3-none-any.whl.metadata (8.5 kB) Collecting concurrent-log-handler>=0.9.20 (from omero-web) Using cached concurrent_log_handler-0.9.25-py3-none-any.whl.metadata (17 kB) Collecting Django<4.3,>=4.2.3 (from omero-web) Using cached Django-4.2.16-py3-none-any.whl.metadata (4.1 kB) Collecting django-pipeline==2.1.0 (from omero-web) Using cached django_pipeline-2.1.0-py3-none-any.whl.metadata (12 kB) Collecting django-cors-headers==3.7.0 (from omero-web) Using cached django_cors_headers-3.7.0-py3-none-any.whl.metadata (15 kB) Collecting whitenoise>=5.3.0 (from omero-web) Using cached whitenoise-6.7.0-py3-none-any.whl.metadata (3.7 kB) Collecting gunicorn>=19.3 (from omero-web) Using cached gunicorn-23.0.0-py3-none-any.whl.metadata (4.4 kB) Collecting omero-marshal>=0.7.0 (from omero-web) Using cached omero_marshal-0.9.0-py3-none-any.whl.metadata (2.2 kB) Collecting pytz (from omero-web) Using cached pytz-2024.2-py2.py3-none-any.whl.metadata (22 kB) Requirement already satisfied: packaging in ./.venv3/lib/python3.9/site-packages (from omero-web) (24.1) Collecting asgiref<4,>=3.6.0 (from Django<4.3,>=4.2.3->omero-web) Using cached asgiref-3.8.1-py3-none-any.whl.metadata (9.3 kB) Collecting sqlparse>=0.3.1 (from Django<4.3,>=4.2.3->omero-web) Using cached 
sqlparse-0.5.1-py3-none-any.whl.metadata (3.9 kB) Collecting charset-normalizer<4,>=2 (from requests->omero-py) Using cached charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (34 kB) Collecting idna<4,>=2.5 (from requests->omero-py) Using cached idna-3.10-py3-none-any.whl.metadata (10 kB) Collecting certifi>=2017.4.17 (from requests->omero-py) Using cached certifi-2024.8.30-py3-none-any.whl.metadata (2.2 kB) Collecting typing-extensions>=4 (from asgiref<4,>=3.6.0->Django<4.3,>=4.2.3->omero-web) Using cached typing_extensions-4.12.2-py3-none-any.whl.metadata (3.0 kB) Using cached omero_py-5.19.5-py3-none-any.whl (2.8 MB) Using cached omero_web-5.27.2-py3-none-any.whl (2.8 MB) Using cached django_cors_headers-3.7.0-py3-none-any.whl (12 kB) Using cached django_pipeline-2.1.0-py3-none-any.whl (38 kB) Using cached concurrent_log_handler-0.9.25-py3-none-any.whl (25 kB) Using cached Django-4.2.16-py3-none-any.whl (8.0 MB) Using cached gunicorn-23.0.0-py3-none-any.whl (85 kB) Using cached numpy-1.26.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB) Using cached omero_marshal-0.9.0-py3-none-any.whl (88 kB) Using cached pillow-11.0.0-cp39-cp39-manylinux_2_28_x86_64.whl (4.4 MB) Using cached portalocker-2.10.1-py3-none-any.whl (18 kB) Using cached urllib3-1.26.20-py2.py3-none-any.whl (144 kB) Using cached whitenoise-6.7.0-py3-none-any.whl (19 kB) Using cached appdirs-1.4.4-py2.py3-none-any.whl (9.6 kB) Using cached pytz-2024.2-py2.py3-none-any.whl (508 kB) Using cached requests-2.32.3-py3-none-any.whl (64 kB) Using cached asgiref-3.8.1-py3-none-any.whl (23 kB) Using cached certifi-2024.8.30-py3-none-any.whl (167 kB) Using cached charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (144 kB) Using cached idna-3.10-py3-none-any.whl (70 kB) Using cached sqlparse-0.5.1-py3-none-any.whl (44 kB) Using cached typing_extensions-4.12.2-py3-none-any.whl (37 kB) Installing collected packages: pytz, django-pipeline, appdirs, whitenoise, urllib3, typing-extensions, sqlparse, portalocker, Pillow, omero-marshal, numpy, idna, gunicorn, charset-normalizer, certifi, requests, concurrent-log-handler, asgiref, omero-py, Django, django-cors-headers, omero-web Attempting uninstall: numpy Found existing installation: numpy 2.0.2 Uninstalling numpy-2.0.2: Successfully uninstalled numpy-2.0.2 Successfully installed Django-4.2.16 Pillow-11.0.0 appdirs-1.4.4 asgiref-3.8.1 certifi-2024.8.30 charset-normalizer-3.4.0 concurrent-log-handler-0.9.25 django-cors-headers-3.7.0 django-pipeline-2.1.0 gunicorn-23.0.0 idna-3.10 numpy-1.26.4 omero-marshal-0.9.0 omero-py-5.19.5 omero-web-5.27.2 portalocker-2.10.1 pytz-2024.2 requests-2.32.3 sqlparse-0.5.1 typing-extensions-4.12.2 urllib3-1.26.20 whitenoise-6.7.0 [OMERO-test-integration] $ /bin/bash -xe /tmp/jenkins21541291586645220.sh + source /home/settings.env ++ OMERO_DB_HOST=pg ++ OMERO_DB_USER=postgres ++ OMERO_DB_PASS= ++ OMERO_ROOT_PASS=omero ++ OMERO_DATA_DIR=/OMERO ++ export OMERO_DB_USER OMERO_DB_PASS OMERO_ROOT_PASS OMERO_DATA_DIR + OMERO_DB_NAME=OMERO-test-integration + SRC=/home/omero/workspace/OMERO-test-integration/src + OMERO_DIST=/home/omero/workspace/OMERO-test-integration/src/dist + export OMERODIR=/home/omero/workspace/OMERO-test-integration/src/dist + OMERODIR=/home/omero/workspace/OMERO-test-integration/src/dist + '[' -e /home/omero/workspace/OMERO-test-integration/src/dist ']' + source /home/omero/workspace/OMERO-test-integration/.venv3/bin/activate ++ deactivate nondestructive ++ 
'[' -n '' ']'
++ '[' -n '' ']'
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
++ '[' -n '' ']'
++ unset VIRTUAL_ENV
++ '[' '!' nondestructive = nondestructive ']'
++ VIRTUAL_ENV=/home/omero/workspace/OMERO-test-integration/.venv3
++ export VIRTUAL_ENV
++ _OLD_VIRTUAL_PATH=/opt/ice-3.6.5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ PATH=/home/omero/workspace/OMERO-test-integration/.venv3/bin:/opt/ice-3.6.5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
++ export PATH
++ '[' -n '' ']'
++ '[' -z '' ']'
++ _OLD_VIRTUAL_PS1=
++ PS1='(.venv3) '
++ export PS1
++ '[' -n /bin/bash -o -n '' ']'
++ hash -r
+ omero admin stop
Waiting on shutdown. Use CTRL-C to exit .
+ deactivate
+ '[' -n /opt/ice-3.6.5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin ']'
+ PATH=/opt/ice-3.6.5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+ export PATH
+ unset _OLD_VIRTUAL_PATH
+ '[' -n '' ']'
+ '[' -n /bin/bash -o -n '' ']'
+ hash -r
+ '[' -n '' ']'
+ unset VIRTUAL_ENV
+ '[' '!' '' = nondestructive ']'
+ unset -f deactivate
+ sleep 5
+ dropdb -f -h pg -U postgres OMERO-test-integration
+ rm -rf /home/omero/workspace/OMERO-test-integration/config /home/omero/workspace/OMERO-test-integration/data /home/omero/workspace/OMERO-test-integration/omero-blitz-5.7.5-SNAPSHOT-python.zip /home/omero/workspace/OMERO-test-integration/omero_dropbox-5.7.1.dev0.tar.gz /home/omero/workspace/OMERO-test-integration/omero_marshal-0.9.1.dev0.tar.gz /home/omero/workspace/OMERO-test-integration/omero_py-5.19.6.dev0.tar.gz /home/omero/workspace/OMERO-test-integration/omero_scripts-5.8.4.dev0.tar.gz /home/omero/workspace/OMERO-test-integration/omero_web-5.27.3.dev0.tar.gz /home/omero/workspace/OMERO-test-integration/src /home/omero/workspace/OMERO-test-integration/version.properties
Copied 1 artifact from "OMERO-build" build number 214
Copied 5 artifacts from "OMERO-python-superbuild-build" build number 211
Copied 2 artifacts from "OMERO-build-build" build number 222
[OMERO-test-integration] $ /bin/bash -xe /tmp/jenkins13744323007072805623.sh
+ source /home/settings.env
++ OMERO_DB_HOST=pg
++ OMERO_DB_USER=postgres
++ OMERO_DB_PASS=
++ OMERO_ROOT_PASS=omero
++ OMERO_DATA_DIR=/OMERO
++ export OMERO_DB_USER OMERO_DB_PASS OMERO_ROOT_PASS OMERO_DATA_DIR
+ export ZIP_FILE=/home/omero/workspace/OMERO-test-integration/omero-blitz-VERSION-python.zip
+ ZIP_FILE=/home/omero/workspace/OMERO-test-integration/omero-blitz-VERSION-python.zip
+ export VERSION_PROPERTIES=/home/omero/workspace/OMERO-test-integration/version.properties
+ VERSION_PROPERTIES=/home/omero/workspace/OMERO-test-integration/version.properties
+ OMERO_DB_NAME=OMERO-test-integration
+ OMERO_DATA_DIR=/home/omero/workspace/OMERO-test-integration/data
+ mkdir -p /home/omero/workspace/OMERO-test-integration/data
+ SRC=/home/omero/workspace/OMERO-test-integration/src
++ ls openmicroscopy-5.6.3-513-75ed6e6d79-ice36.zip
+ ZIP_SRC=openmicroscopy-5.6.3-513-75ed6e6d79-ice36.zip
+ unzip openmicroscopy-5.6.3-513-75ed6e6d79-ice36.zip
Archive: openmicroscopy-5.6.3-513-75ed6e6d79-ice36.zip creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/.classpath-template creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/.github/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/.github/Pull_Request_Template.md creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/.github/workflows/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/.github/workflows/docker_build.yml inflating:
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/.github/workflows/release.yml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/.github/workflows/source_build.yml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/.github/workflows/update.yaml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/.mailmap inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/.project inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/CMakeLists.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/CONTRIBUTING.md inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/Dockerfile inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/LICENSE.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/README.md inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/SUPPORT.md inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/build.bat inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/build.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/build.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/.classpath inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/.project creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/resources/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/resources/excludebugs.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/resources/global.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/resources/hibernate.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/resources/lifecycle.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/resources/version.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/scripts/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/scripts/parse_version inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/scripts/source-archive.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/build.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/README.md inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/build.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/plugins/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/plugins/robot.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/resources/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/resources/robot.template creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/resources/web/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/resources/web/annotation.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/resources/web/login.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/resources/web/thumbs.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/resources/web/tree.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/resources/web/webadmin.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/resources/well_sample_posXY.py inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/robot_setup.sh creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/annotate_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/center_right_panel_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/create_scenario.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/delete_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/download_options_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/filter_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/forms_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/history_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/map_annotations_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/post_copy_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/post_cut_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/post_drag_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/post_manage_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/rdef_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/search.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/search_annotation_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/show_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/spw_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/tagging_test.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/view_image.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/webadmin_create_group_and_user.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/webadmin_login.robot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tests/ui/testcases/web/webadmin_user_settings.robot creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/CMakeLists.txt creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/cmake/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/cmake/CompilerChecks.cmake inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/cmake/FindIce.cmake inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/cmake/GTest.cmake inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/cmake/TemplateCmdConfig.cmake.in inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/cmake/TemplateConfig.cmake.in inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/cmake/TemplateShellConfig.cmake.in inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/cmake/Version.cmake creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/CHANGES inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/CMakeLists.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/CONTRIBUTORS inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/LICENSE inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/README creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/cmake/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/cmake/internal_utils.cmake creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/gtest-death-test.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/gtest-message.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/gtest-param-test.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/gtest-param-test.h.pump inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/gtest-printers.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/gtest-spi.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/gtest-test-part.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/gtest-typed-test.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/gtest.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/gtest_pred_impl.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/gtest_prod.h creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-death-test-internal.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-filepath.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-internal.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-linked_ptr.h inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-param-util-generated.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-param-util-generated.h.pump inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-param-util.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-port.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-string.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-tuple.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-tuple.h.pump inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-type-util.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/include/gtest/internal/gtest-type-util.h.pump creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/prime_tables.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample1.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample1.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample10_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample1_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample2.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample2.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample2_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample3-inl.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample3_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample4.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample4.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample4_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample5_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample6_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample7_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample8_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/samples/sample9_unittest.cc creating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/scripts/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/scripts/fuse_gtest_files.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/scripts/gen_gtest_pred_impl.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/scripts/gtest-config.in inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/scripts/pump.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/scripts/test/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/scripts/test/Makefile creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/src/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/src/gtest-all.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/src/gtest-death-test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/src/gtest-filepath.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/src/gtest-internal-inl.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/src/gtest-port.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/src/gtest-printers.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/src/gtest-test-part.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/src/gtest-typed-test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/src/gtest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/src/gtest_main.cc creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-death-test_ex_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-death-test_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-filepath_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-linked_ptr_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-listener_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-message_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-options_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-param-test2_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-param-test_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-param-test_test.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-port_test.cc inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-printers_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-test-part_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-tuple_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-typed-test2_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-typed-test_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-typed-test_test.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest-unittest-api_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_all_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_break_on_failure_unittest.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_break_on_failure_unittest_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_catch_exceptions_test.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_catch_exceptions_test_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_color_test.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_color_test_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_env_var_test.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_env_var_test_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_environment_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_filter_unittest.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_filter_unittest_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_help_test.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_help_test_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_list_tests_unittest.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_list_tests_unittest_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_main_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_no_test_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_output_test.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_output_test_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_output_test_golden_lin.txt inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_pred_impl_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_premature_exit_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_prod_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_repeat_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_shuffle_test.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_shuffle_test_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_sole_header_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_stress_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_test_utils.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_throw_on_failure_ex_test.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_throw_on_failure_test.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_throw_on_failure_test_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_uninitialized_test.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_uninitialized_test_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_unittest.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_xml_outfile1_test_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_xml_outfile2_test_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_xml_outfiles_test.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_xml_output_unittest.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_xml_output_unittest_.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/gtest_xml_test_utils.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/production.cc inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/ext/gtest-1.7.0/test/production.h creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/README.h creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/ClientErrors.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/ClientErrors.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/IceNoWarnPop.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/IceNoWarnPush.h 
extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/IcePortPop.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/IcePortPush.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/ObjectFactory.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/ObjectFactory.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/ObjectFactoryRegistrar.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/ObjectFactoryRegistrar.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/RTypesI.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/RTypesI.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/all.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/callbacks.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/callbacks.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/client.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/client.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/clientF.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/clientF.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/conversions.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/conversions.h creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/internal/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/internal/fixes.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/min.h creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/DetailsI.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/DetailsI.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/ElectricPotentialI.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/ElectricPotentialI.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/FrequencyI.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/FrequencyI.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/LengthI.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/LengthI.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/ObjectFactory.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/PermissionsI.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/PermissionsI.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/PowerI.cpp inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/PowerI.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/PressureI.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/PressureI.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/TemperatureI.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/TemperatureI.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/TimeI.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/model/TimeI.h creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/sys/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/sys/ParametersI.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/sys/ParametersI.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/templates.h creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/util/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/util/concurrency.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/util/concurrency.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/util/tiles.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/util/tiles.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/util/uuid.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/src/omero/util/uuid.h creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/CMakeLists.txt creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/CMakeLists.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/ClientUsageTest.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/admin.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/annotations.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/api.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/beta3.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/chgrp.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/cmdcallbacktest.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/counts.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/delete.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/guest.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/mapannotation.cpp inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/model51.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/permissions.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/rendering_settings_test.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/search.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/integration/sessions.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/main.cpp creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/omero/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/omero/CMakeLists.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/omero/fixture.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/omero/fixture.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/safegtest.h creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/unit/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/unit/CMakeLists.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/unit/client.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/unit/concurrency.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/unit/conversions.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/unit/model.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/unit/operators.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/unit/parameters.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/unit/perms.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/unit/rtypes.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/test/unit/units.cpp creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/utils/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/utils/CMakeLists.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/utils/chgrp.cpp creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/build.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/ivy.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/setup.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/test/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/test/drivers.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/test/integration/ extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/test/integration/__init__.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/test/integration/test_dbclient.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/test/records/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/test/records/first.txt 
inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroFS/test/records/outofsync.txt creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/.classpath-template inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/.project inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/build.gradle inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/build.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/ivy.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/resources/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/settings.gradle inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/fs.testng.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/gateway.testng.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration.testng.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/AbstractServerImportTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/AbstractServerTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/AbstractTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/AdminServiceTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/AgentTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/BfPixelBufferTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/BlockSizeTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ClientUsageTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/CmdCallbackTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ConfigurationServiceTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/CreatePojosFixture2.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/DeleteServiceFilesTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/DeleteServicePermissionsTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/DeleteServiceTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/DiskUsageTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/DuplicationTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ExporterTest.java inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ExtendedAnnotationTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/FindParentsChildrenTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ImportAsTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ImportLibraryTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ImporterTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/LegalGraphTargetsTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/LightAdminPrivilegesTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/LightAdminRolesTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/MapAnnotationTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/MetadataServiceTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ModelMockFactory.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/PermissionsTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/PermissionsTestAll.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/PixelsServiceTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/PojosServiceTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ProjectionServiceTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/PyramidMinMaxTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/RawFileStoreTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/RawPixelsStoreTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/RenderingEngineTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/RenderingSettingsServicePermissionsTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/RenderingSettingsServiceTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/RepositoryServiceTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ResetPasswordTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/RoiServiceTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/RolesTests.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/ScriptServiceTest.java inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/TableTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/UpdateServiceTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/UpgradeCheckTest.java creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chgrp/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chgrp/AnnotationMoveTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chgrp/HierarchyMoveAndPermissionsTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chgrp/HierarchyMoveCombinedDataTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chgrp/HierarchyMoveDatasetTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chgrp/HierarchyMoveImageWithAcquisitionDataTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chgrp/HierarchyMoveImageWithRoiFromOtherUserTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chgrp/HierarchyMoveImageWithRoiTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chgrp/HierarchyMoveTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chgrp/MultiImageFilesetMoveTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chgrp/RenderingSettingsMoveTest.java creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chmod/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chmod/PermissionsTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chmod/RolesTest.java creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chown/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/chown/PermissionsTest.java creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/delete/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/delete/AdditionalDeleteTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/delete/AnnotationDeleteTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/delete/DeleteProjectedImageTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/delete/HierarchyDeleteTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/delete/MultiImageFilesetDeleteTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/delete/RelatedToTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/delete/RoiDeleteTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/delete/SpwDeleteTest.java creating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/AdminFacilityTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/BrowseFacilityTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/DataManagerFacilityTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/GatewayTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/GatewayUsageTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/LoadFacilityTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/MetadataFacilityTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/ROIFacilityTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/RawDataFacilityTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/SearchFacilityTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/gateway/TablesFacilityTest.java creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/thumbnail/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/thumbnail/BatchLoadingTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/thumbnail/SingleFileTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/thumbnail/SkipThumbnailsPermissionsTest.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroJava/test/integration/thumbnail/Utils.java creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/bin/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/bin/omero inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/bin/omero.bat inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/bin/setpythonpath.bat inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/bin/winconfig.bat inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/build.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/examples/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/examples/createSession.py extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/examples/load.omero inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/examples/projection_1.py extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/examples/submit.omero inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/examples/upload.omero inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/ice.config inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/ivy.xml creating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/manualtests/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/manualtests/README.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/manualtests/populate_roi_test.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/setup.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/ extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/__init__.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/conftest.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/ extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/__init__.py extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/client_ctors.cfg creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/ extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/__init__.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/cli.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_admin.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_chgrp.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_chown.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_cleanse.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_db.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_delete.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_download.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_duplicate.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_fs.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_group.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_import.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_import_bulk.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_ldap.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_metadata.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_obj.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_pyramids.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_script.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_search.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_sessions.py inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_tag.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_upload.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/clitest/test_user.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/fstest/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/fstest/test_rename.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/__init__.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_annotation.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_chgrp.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_chmod.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_chown.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_config_service.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_connection.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_delete.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_fs.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_get_objects.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_helpers.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_image.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_image_wrapper.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_missing_pyramid.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_multi_group.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_performance.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_permissions.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_pixels.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_plate_wrapper.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_rdefs.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_search_objects.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_services.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_ticket10618.py inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_user.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/gatewaytest/test_wrapper.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/helpers.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/metadata/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/metadata/bulk_to_map_annotation_context.yml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/metadata/bulk_to_map_annotation_context_ns.csv inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/metadata/bulk_to_map_annotation_context_ns.yml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/metadata/bulk_to_map_annotation_context_ns2.yml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/metadata/bulk_to_map_annotation_context_ns2_empty.yml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/metadata/bulk_to_map_annotation_context_ns2_fail.yml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/metadata/test_metadata_mapannotations.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/metadata/test_populate.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/metadata/test_pydict_text.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptsharness/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptsharness/definition.cfg inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptsharness/definition.py extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptsharness/simple_script.cfg inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptsharness/simple_script.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptsharness/test_harness.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptstest/ extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptstest/__init__.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptstest/script.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptstest/test_cli.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptstest/test_coverage.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptstest/test_inputs.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptstest/test_ping.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptstest/test_rand.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptstest/test_repo.py inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptstest/test_roi_handling_utils.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/scriptstest/test_script_utils.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/tablestest/ extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/tablestest/service-reference-dev_4_4_5.h5.bz2 extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/tablestest/service-reference-dev_5_3_4.h5.bz2 inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/tablestest/test_backwards_compatibility.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/tablestest/test_populate_metadata.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/tablestest/test_service.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/tablestest/test_tables.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_admin.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_annotation.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_annotationPermissions.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_chgrp.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_chmod.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_chown.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_client_ctors.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_clientusage.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_cmdcallback.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_counts.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_delete.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_exporter.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_files.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_iconfig.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_icontainer.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_ildap.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_imetadata.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_iquery.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_isession.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_ishare.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_itimeline.py inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_itypes.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_iupdate.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_librarytest.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_mail.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_mapannotation.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_metadatastore.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_model51.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_permissions.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_pixelsService.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_rawfilestore.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_rawpixelsstore.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_reimport.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_render.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_reporawfilestore.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_repository.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_rois.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_scripts.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_search.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_simple.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_thumbnailPerms.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_thumbs.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_tickets1000.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_tickets2000.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_tickets3000.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_tickets4000.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_tickets6000.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroPy/test/integration/test_util.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/README.rst inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/build.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/setup.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/ inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/conftest.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_annotate.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_api_containers.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_api_errors.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_api_experimenters_groups.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_api_images.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_api_login.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_api_projects.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_api_rois.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_api_wells.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_chgrp.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_chown.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_config.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_containers.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_csrf.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_decorators.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_download.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_groups_users.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_histogram.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_history.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_links.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_login.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_marshal.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_metadata.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_plategrid.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_rendering.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_scripts.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_show.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_simple.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_table.py inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_tags.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_thumbnails.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_tree.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_tree_annotations.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroWeb/test/integration/test_webadmin.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/build.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/bump_version.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/common.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/pytest.ini inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/python.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/travis-build creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/headers.txt creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/hudson/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/hudson/OMERO.sh inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/hudson/README.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/hudson/functions.sh creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/images/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/images/block-diagram.svg creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/images/build/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/images/build/build_files.dot inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/images/build/build_files.png inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/images/build/build_files.svg inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/images/build/newbuild.dot creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/install/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/install/VM/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/install/VM/README.md inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/install/python_deps.sh inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/overview.html inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/package-template.html creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sequencediagrams/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sequencediagrams/Sessions.ods creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/Makefile inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/build.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/conf.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/index.rst inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/rules.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/sphinx.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/themes/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/themes/api_theme/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/themes/api_theme/layout.html creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/themes/api_theme/static/ inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/themes/api_theme/static/api_style.css_t inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/sphinx-api/themes/api_theme/theme.conf creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/styles/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/styles/CodeTemplate.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/styles/CodeTemplate.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/styles/OMERO_Eclipse_Code_Style.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/styles/README.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/styles/checkstyle.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/docs/styles/headers.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/backup.cfg creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/blitz/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/blitz/mail-senders.example inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/blitz/mail-server.example inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/build.properties inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/env.bat inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/env.sh inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/hibernate.properties inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/ice.config inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/ivysettings.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/logback-cli.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/logback-indexing-cli.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/logback-indexing.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/logback.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/mime.types inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/node1.cfg inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/node2.cfg inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/omero.properties creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/profiles/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/profiles/psql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/rollover.cfg creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/Windows.cfg creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/error/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/error/maintainance.html creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/grid/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/grid/README inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/grid/default.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/grid/templates.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/grid/windefault.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/ice.config inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/internal.cfg inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/templates/master.cfg inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/etc/testdropbox.config creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/CMakeLists.txt creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/ 
inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/CMakeLists.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/Callback.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/Callback.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/Callback.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/FileAnnotationDelete.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/FileAnnotationDelete.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/Options.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/Options.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/Options.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/Subclass.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/Subclass.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Delete/Subclass.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/CMakeLists.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/clientpointer.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/collectionmethods.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/collectionmethods.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/collectionmethods.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/collectionmethods.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/configuration.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/configuration.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/configuration.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/configuration.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/constants.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/constants.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/constants.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/constants.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/constructors.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/constructors.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/constructors.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/constructors.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/details.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/details.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/details.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/details.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/enumerations.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/enumerations.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/enumerations.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/enumerations.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/getsetattr.py inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/interfaces.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/interfaces.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/interfaces.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/iterators.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/iterators.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/iterators.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/iterators.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/lists.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/lists.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/primitives.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/primitives.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/primitives.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/primitives.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/queries.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/queries.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/queries.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/queries.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/rcollection.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/rcollection.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/rcollection.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/rcollection.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/smartpointers.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/staticfields.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/staticfields.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/staticfields.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/staticfields.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/sudo.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/sudo.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/sudo.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/sudo.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/timeout.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/timeout.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/unloaded.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/unloaded.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/unloaded.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/unloaded.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/updates.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/updates.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/updates.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroClients/updates.py creating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroTables/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroTables/FindMeasurements.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroTables/MeasurementTable.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroTables/first.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/OmeroTables/iroi.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/README.txt creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/RegionsOfInterest/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/RegionsOfInterest/CMakeLists.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/RegionsOfInterest/Main.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/RegionsOfInterest/Main.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/RegionsOfInterest/Main.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/RegionsOfInterest/SConscript inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/SConstruct creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScreenPlateWell/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScreenPlateWell/imagesperwell.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScreenPlateWell/imagesperwell.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScreenPlateWell/listplates.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScreenPlateWell/listplates.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScreenPlateWell/platebybarcode.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScreenPlateWell/platecodebyimage.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScreenPlateWell/platecodebyimage.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScriptingService/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScriptingService/Edit_Descriptions.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScriptingService/HelloWorld.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScriptingService/NativeWrapperExample.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScriptingService/Notifications.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScriptingService/Notifications.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScriptingService/adminWorkflow.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/ScriptingService/runHelloWorld.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/.classpath inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/.project creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/CreateImage.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/DeleteData.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/HowToUseTables.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/LoadMetadataAdvanced.java inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/ROIFolders.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/ROIs.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/RawDataAccess.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/ReadData.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/ReadDataAdvanced.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/RenderImages.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/Setup.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/Sudo.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/java/src/training/WriteData.java creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/javascript/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/javascript/index.html inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/javascript/utils.js inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/markup.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/ConnectToOMERO.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/CreateImage.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/DeleteData.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/Filesets.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/LoadMetadataAdvanced.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/ROIs.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/RawDataAccess.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/ReadData.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/ReadDataAdvanced.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/RenderImages.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/WriteData.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/exampleSuite.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/matlab/parseOmeroProperties.m creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Advanced/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Advanced/Create_Image_advanced.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Advanced/Raw_Data_advanced.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Advanced/Read_Data_advanced.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Advanced/Write_data_advanced.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Bulk_Shapes.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Connect_To_OMERO.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Create_Image.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Delete.py inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Filesets.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Groups_Permissions.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Json_Api/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Json_Api/Login.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Metadata.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Parse_OMERO_Properties.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/ROIs.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Raw_Data_Access.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Read_Data.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Render_Images.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Scripting_Service_Example.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Tables.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Task_Scripts/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Task_Scripts/ROIs_To_Table.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Task_Scripts/Raw_Data2.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Task_Scripts/Raw_Data_Task.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Task_Scripts/Write_Data_3.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Task_Scripts/Write_Data_4.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/Write_Data.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/python/__main__.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/Training/training_setup.sh creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/AllProjects.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/AllProjects.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/AllProjects.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/AllProjects.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/CMakeLists.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/Main.cpp inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/Main.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/Main.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/Main.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/PrintProjects.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/PrintProjects.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/PrintProjects.m inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/PrintProjects.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/SConscript inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/Usage.h inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/Usage.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/Usage.m inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/examples/TreeList/Usage.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/history.rst inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/ivy.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/IVY1016/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/IVY1016/EclipseClasspath.class inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/IVY1016/EclipseClasspath.java creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/LICENSE-ASM.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/LICENSE-AppleJavaExtensions.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/LICENSE-bcel.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/LICENSE-docbook.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/LICENSE-dom4j.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/LICENSE-jcip.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/LICENSE.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/README.txt creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/annotations.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/asm-3.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/asm-analysis-3.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/asm-commons-3.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/asm-tree-3.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/asm-util-3.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/asm-xml-3.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/bcel.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/buggy.icns inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/dom4j-full.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/lib/findbugs.jar creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/plugin/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/plugin/coreplugin.jar creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/src/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/src/xsl/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/src/xsl/default.xsl inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/src/xsl/fancy-hist.xsl inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/src/xsl/fancy.xsl inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/src/xsl/plain.xsl inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/findbugs-1.2.1/src/xsl/summary.xsl inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/keystore creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/README inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/agpl.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/antlr.txt inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/apache-v1.1.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/apache-v2.0.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/asm.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/bouncy_castle_license.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/bsd.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/cddl-v1.0.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/cpl-v10.html inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/crystalclear.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/django.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/dom4j.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/epl-v1.0.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/fbsd.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/freemarker.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/gpl-v2.0.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/gpl.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/gtest.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/iconsweets_license.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/jamon.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/jmock.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/jquery.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/jtidy.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/lgpl-3.0.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/lgpl-v2.0.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/lgpl-v2.1.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/lgpl-v3.0.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/lgpl.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/matplotlib.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/mit.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/mpl-v1_0.htm inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/npl-1_1.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/path.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/pil.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/postgresql.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/psf-v2.0.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/psf-v2.5.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/scons.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/slf4j.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/tablelayout.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/which.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/zlib-libpng.txt inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/licenses/zpl-v2.1.txt creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/linux-native/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/linux-native/libnativewindow_jvm.so creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/linux64-native/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/linux64-native/libnativewindow_jvm.so inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/logback-build.xml creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/patches/ extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/patches/hibernate-3.5-ColumnTransformer.patch.gz creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/JHotDraw-7.0.9.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/TableLayout-bin-jdk1.5-2009-08-26.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/ant-1.8.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/ant-contrib-1.0b3.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/ant-junit-1.8.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/ant-launcher-1.8.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/ant-nodeps-1.8.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/ant-trax-1.8.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/batik-all-1.8pre-jdk6.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/bufr-1.1.00.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/checkstyle-4.3.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/commons-validator-1.3.1.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/findbugs-ant-1.2.1.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/geronimo-spec-jta-1.0.1B-rc4.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/grib-5.1.03.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/hibernate-core-3.5.6-4510.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/hibernate-jpa-2.0-api-1.0.0.Final.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/hibernate-search-3.1.1.GA.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/hibernate-tools-3.2.0.beta11.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/hibernate-validator-3.1.0.GA.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/ipython-1.2.1.tar.gz inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/ivy-2.4.0.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/omero-icemock-3.5.1.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/omero-icemock-3.6.3.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/omero.java inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/oro-2.0.8.jar inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/lib/repository/scons-local-2.1.0.zip creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/3-3-2_rgb.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/fire.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/glasbey.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/glasbey_inverted.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/glow.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/grays.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ice.lut creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/janelia/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/janelia/pup_br.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/janelia/pup_nr.lut creating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/16_colors.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/5_ramps.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/6_shades.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/blue_orange_icb.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/brgbcmyw.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/cool.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/edges.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/gem.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/phase.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/royal.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/sepia.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/smart.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/thal.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/thallium.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/ncsa_paledit/unionjack.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/physics.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/red-green.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/spectrum.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/thermal.lut creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/cyan_hot.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/green_fire_blue.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/hilo.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/ica.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/ica2.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/ica3.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/magenta_hot.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/orange_hot.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/rainbow_rgb.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/red_hot.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/luts/wcif/yellow_hot.lut inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/omero.class inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/setup.cfg creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/README.txt creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/misc/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/misc/Makefile inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/misc/current.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/misc/enums.py creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/ creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__1/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__1/OMERO3__7.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__10/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__10/OMERO3A__9.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__11/ inflating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__11/OMERO3A__10.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__11/OMERO3A__5.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__2/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__2/OMERO3A__1.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__3/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__3/OMERO3A__2.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__3/OMERO3__5.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__4/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__4/OMERO3A__3.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__4/OMERO3__5.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__5/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__5/OMERO3A__4.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__6/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__6/OMERO3A__5.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__7/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__7/OMERO3A__6.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__8/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__8/OMERO3A__6.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__9/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3A__9/OMERO3A__8.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3__4/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3__4/OMERO3__1.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3__5/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3__5/OMERO3__4.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3__6/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3__6/OMERO3__5.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3__7/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO3__7/OMERO3__6.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.1__0/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.1__0/OMERO4__0.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.2__0/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.2__0/OMERO4.1__0.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.2__0/t/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.2__0/t/OMERO4.1__0.py inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.2__0/t/OMERO4.1__0.test creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.3__0/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.3__0/OMERO4.2__0.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.3__0/omero-4.2-data-fix.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.4__0/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.4__0/OMERO4.1__0.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4.4__0/OMERO4.3__0.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4__0/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO4__0/OMERO3A__11.sql creating: 
openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.0__0/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.0__0/2014-SV2-empty-passwords-fix.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.0__0/2014-SV2-empty-passwords-list.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.0__0/OMERO4.4__0.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.1__1/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.1__1/OMERO5.0__0.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.1__1/OMERO5.1-delete-and-disable-shares.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.1__1/OMERO5.1-enable-shares.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.1__1/OMERO5.1__0.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.2__0/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.2__0/OMERO5.1__1.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.2__0/OMERO5.2-delete-and-disable-shares.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.2__0/OMERO5.2-enable-shares.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.3__0/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.3__0/OMERO5.2__0-precheck.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.3__0/OMERO5.2__0.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.3__0/OMERO5.3-index-files.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.3__0/delete-ns-orphans.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.3__0/reverse_shape_color_argb_to_rgba.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.3__0/shape_color_argb_to_rgba.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4DEV__1/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4DEV__1/OMERO5.4DEV__0.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4DEV__2/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4DEV__2/OMERO5.4DEV__1.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4DEV__3/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4DEV__3/OMERO5.4DEV__2.sql creating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4__0/ inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4__0/OMERO5.3__0-precheck.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4__0/OMERO5.3__0.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4__0/OMERO5.3__1-precheck.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4__0/OMERO5.3__1.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4__0/OMERO5.4DEV__3.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4__0/allow-guest-user-without-password.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4__0/psql-footer.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4__0/psql-header.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4__0/schema.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/sql/psql/OMERO5.4__0/views.sql inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/test.xml inflating: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/update_dependencies.sh 
extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/antlib/resources/gitversion.xml extracting: openmicroscopy-5.6.3-513-75ed6e6d79-ice36/components/tools/OmeroCpp/cmake/GitVersion.cmake + rm -f openmicroscopy-5.6.3-513-75ed6e6d79-ice36.zip + mv /home/omero/workspace/OMERO-test-integration/openmicroscopy-5.6.3-513-75ed6e6d79-ice36 /home/omero/workspace/OMERO-test-integration/src + source /home/omero/workspace/OMERO-test-integration/.venv3/bin/activate ++ deactivate nondestructive ++ '[' -n '' ']' ++ '[' -n '' ']' ++ '[' -n /bin/bash -o -n '' ']' ++ hash -r ++ '[' -n '' ']' ++ unset VIRTUAL_ENV ++ '[' '!' nondestructive = nondestructive ']' ++ VIRTUAL_ENV=/home/omero/workspace/OMERO-test-integration/.venv3 ++ export VIRTUAL_ENV ++ _OLD_VIRTUAL_PATH=/opt/ice-3.6.5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin ++ PATH=/home/omero/workspace/OMERO-test-integration/.venv3/bin:/opt/ice-3.6.5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin ++ export PATH ++ '[' -n '' ']' ++ '[' -z '' ']' ++ _OLD_VIRTUAL_PS1= ++ PS1='(.venv3) ' ++ export PS1 ++ '[' -n /bin/bash -o -n '' ']' ++ hash -r + for x in *.tar.gz + pip install -U omero_dropbox-5.7.1.dev0.tar.gz Processing ./omero_dropbox-5.7.1.dev0.tar.gz Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Requirement already satisfied: omero-py in ./.venv3/lib/python3.9/site-packages (from omero-dropbox==5.7.1.dev0) (5.19.5) Requirement already satisfied: urllib3<2 in ./.venv3/lib/python3.9/site-packages (from omero-py->omero-dropbox==5.7.1.dev0) (1.26.20) Requirement already satisfied: appdirs in ./.venv3/lib/python3.9/site-packages (from omero-py->omero-dropbox==5.7.1.dev0) (1.4.4) Requirement already satisfied: future in ./.venv3/lib/python3.9/site-packages (from omero-py->omero-dropbox==5.7.1.dev0) (1.0.0) Requirement already satisfied: numpy<2 in ./.venv3/lib/python3.9/site-packages (from omero-py->omero-dropbox==5.7.1.dev0) (1.26.4) Requirement already satisfied: Pillow>=10.0.0 in ./.venv3/lib/python3.9/site-packages (from omero-py->omero-dropbox==5.7.1.dev0) (11.0.0) Requirement already satisfied: PyYAML in ./.venv3/lib/python3.9/site-packages (from omero-py->omero-dropbox==5.7.1.dev0) (6.0.2) Requirement already satisfied: zeroc-ice<3.7,>=3.6.5 in ./.venv3/lib/python3.9/site-packages (from omero-py->omero-dropbox==5.7.1.dev0) (3.6.5) Requirement already satisfied: requests in ./.venv3/lib/python3.9/site-packages (from omero-py->omero-dropbox==5.7.1.dev0) (2.32.3) Requirement already satisfied: portalocker in ./.venv3/lib/python3.9/site-packages (from omero-py->omero-dropbox==5.7.1.dev0) (2.10.1) Requirement already satisfied: charset-normalizer<4,>=2 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py->omero-dropbox==5.7.1.dev0) (3.4.0) Requirement already satisfied: idna<4,>=2.5 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py->omero-dropbox==5.7.1.dev0) (3.10) Requirement already satisfied: certifi>=2017.4.17 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py->omero-dropbox==5.7.1.dev0) (2024.8.30) Building wheels for collected packages: omero-dropbox Building wheel for omero-dropbox (pyproject.toml): started Building wheel for omero-dropbox (pyproject.toml): finished with status 'done' Created 
wheel for omero-dropbox: filename=omero_dropbox-5.7.1.dev0-py3-none-any.whl size=49058 sha256=b9f36780141ead4c22d71e31a3e2a67e42138d50c3bd7407c37280066f86135b Stored in directory: /home/omero/.cache/pip/wheels/ef/93/2d/d44d5be6e4d9a58eecdbeb6e3385179e95ae7b80719377e781 Successfully built omero-dropbox Installing collected packages: omero-dropbox Successfully installed omero-dropbox-5.7.1.dev0 + for x in *.tar.gz + pip install -U omero_marshal-0.9.1.dev0.tar.gz Processing ./omero_marshal-0.9.1.dev0.tar.gz Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Building wheels for collected packages: omero_marshal Building wheel for omero_marshal (pyproject.toml): started Building wheel for omero_marshal (pyproject.toml): finished with status 'done' Created wheel for omero_marshal: filename=omero_marshal-0.9.1.dev0-py3-none-any.whl size=88031 sha256=4c9bf3e15e27016ec989f73be26c398b5257bc26e00a472a2fae043aa767732e Stored in directory: /home/omero/.cache/pip/wheels/1a/3a/cd/b39392f5c8ba80a35bab4a0f80f93c7c4729cce6868750f4f3 Successfully built omero_marshal Installing collected packages: omero_marshal Attempting uninstall: omero_marshal Found existing installation: omero-marshal 0.9.0 Uninstalling omero-marshal-0.9.0: Successfully uninstalled omero-marshal-0.9.0 Successfully installed omero_marshal-0.9.1.dev0 + for x in *.tar.gz + pip install -U omero_py-5.19.6.dev0.tar.gz Processing ./omero_py-5.19.6.dev0.tar.gz Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Requirement already satisfied: urllib3<2 in ./.venv3/lib/python3.9/site-packages (from omero-py==5.19.6.dev0) (1.26.20) Requirement already satisfied: appdirs in ./.venv3/lib/python3.9/site-packages (from omero-py==5.19.6.dev0) (1.4.4) Requirement already satisfied: future in ./.venv3/lib/python3.9/site-packages (from omero-py==5.19.6.dev0) (1.0.0) Requirement already satisfied: numpy<2 in ./.venv3/lib/python3.9/site-packages (from omero-py==5.19.6.dev0) (1.26.4) Requirement already satisfied: Pillow>=10.0.0 in ./.venv3/lib/python3.9/site-packages (from omero-py==5.19.6.dev0) (11.0.0) Requirement already satisfied: PyYAML in ./.venv3/lib/python3.9/site-packages (from omero-py==5.19.6.dev0) (6.0.2) Requirement already satisfied: zeroc-ice<3.7,>=3.6.5 in ./.venv3/lib/python3.9/site-packages (from omero-py==5.19.6.dev0) (3.6.5) Requirement already satisfied: requests in ./.venv3/lib/python3.9/site-packages (from omero-py==5.19.6.dev0) (2.32.3) Requirement already satisfied: portalocker in ./.venv3/lib/python3.9/site-packages (from omero-py==5.19.6.dev0) (2.10.1) Requirement already satisfied: charset-normalizer<4,>=2 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py==5.19.6.dev0) (3.4.0) Requirement already satisfied: idna<4,>=2.5 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py==5.19.6.dev0) (3.10) Requirement already satisfied: certifi>=2017.4.17 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py==5.19.6.dev0) (2024.8.30) Building wheels for collected packages: omero-py 
Building wheel for omero-py (pyproject.toml): started Building wheel for omero-py (pyproject.toml): finished with status 'done' Created wheel for omero-py: filename=omero_py-5.19.6.dev0-py3-none-any.whl size=2783246 sha256=5b5398aa7821f32d41a95e49a731e8106c95f2219f2c567942a245e7df1a916e Stored in directory: /home/omero/.cache/pip/wheels/0e/51/53/0205c66cdbe856d4da23e6b9c5d613133b733ef358e57fa643 Successfully built omero-py Installing collected packages: omero-py Attempting uninstall: omero-py Found existing installation: omero-py 5.19.5 Uninstalling omero-py-5.19.5: Successfully uninstalled omero-py-5.19.5 Successfully installed omero-py-5.19.6.dev0 + for x in *.tar.gz + pip install -U omero_scripts-5.8.4.dev0.tar.gz Processing ./omero_scripts-5.8.4.dev0.tar.gz Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Building wheels for collected packages: omero-scripts Building wheel for omero-scripts (pyproject.toml): started Building wheel for omero-scripts (pyproject.toml): finished with status 'done' Created wheel for omero-scripts: filename=omero_scripts-5.8.4.dev0-py3-none-any.whl size=116645 sha256=2bda5159a5c068503b2e67929ec0fe376062d2435a8d92774910c83674dd796e Stored in directory: /home/omero/.cache/pip/wheels/28/3e/92/6522b883548147569f7375ed3132e6964d3645235d0b7e283f Successfully built omero-scripts Installing collected packages: omero-scripts Successfully installed omero-scripts-5.8.4.dev0 + for x in *.tar.gz + pip install -U omero_web-5.27.3.dev0.tar.gz Processing ./omero_web-5.27.3.dev0.tar.gz Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Requirement already satisfied: omero-py>=5.19.0 in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (5.19.6.dev0) Requirement already satisfied: concurrent-log-handler>=0.9.20 in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (0.9.25) Requirement already satisfied: Django<4.3,>=4.2.3 in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (4.2.16) Requirement already satisfied: django-pipeline==2.1.0 in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (2.1.0) Requirement already satisfied: django-cors-headers==3.7.0 in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (3.7.0) Collecting django-csp (from omero-web==5.27.3.dev0) Using cached django_csp-3.8-py3-none-any.whl.metadata (2.7 kB) Requirement already satisfied: whitenoise>=5.3.0 in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (6.7.0) Requirement already satisfied: gunicorn>=19.3 in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (23.0.0) Requirement already satisfied: omero-marshal>=0.7.0 in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (0.9.1.dev0) Requirement already satisfied: Pillow in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (11.0.0) Requirement already satisfied: pytz in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (2024.2) Requirement already satisfied: 
portalocker in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (2.10.1) Requirement already satisfied: packaging in ./.venv3/lib/python3.9/site-packages (from omero-web==5.27.3.dev0) (24.1) Requirement already satisfied: asgiref<4,>=3.6.0 in ./.venv3/lib/python3.9/site-packages (from Django<4.3,>=4.2.3->omero-web==5.27.3.dev0) (3.8.1) Requirement already satisfied: sqlparse>=0.3.1 in ./.venv3/lib/python3.9/site-packages (from Django<4.3,>=4.2.3->omero-web==5.27.3.dev0) (0.5.1) Requirement already satisfied: urllib3<2 in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.19.0->omero-web==5.27.3.dev0) (1.26.20) Requirement already satisfied: appdirs in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.19.0->omero-web==5.27.3.dev0) (1.4.4) Requirement already satisfied: future in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.19.0->omero-web==5.27.3.dev0) (1.0.0) Requirement already satisfied: numpy<2 in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.19.0->omero-web==5.27.3.dev0) (1.26.4) Requirement already satisfied: PyYAML in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.19.0->omero-web==5.27.3.dev0) (6.0.2) Requirement already satisfied: zeroc-ice<3.7,>=3.6.5 in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.19.0->omero-web==5.27.3.dev0) (3.6.5) Requirement already satisfied: requests in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.19.0->omero-web==5.27.3.dev0) (2.32.3) Requirement already satisfied: typing-extensions>=4 in ./.venv3/lib/python3.9/site-packages (from asgiref<4,>=3.6.0->Django<4.3,>=4.2.3->omero-web==5.27.3.dev0) (4.12.2) Requirement already satisfied: charset-normalizer<4,>=2 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py>=5.19.0->omero-web==5.27.3.dev0) (3.4.0) Requirement already satisfied: idna<4,>=2.5 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py>=5.19.0->omero-web==5.27.3.dev0) (3.10) Requirement already satisfied: certifi>=2017.4.17 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py>=5.19.0->omero-web==5.27.3.dev0) (2024.8.30) Using cached django_csp-3.8-py3-none-any.whl (17 kB) Building wheels for collected packages: omero-web Building wheel for omero-web (pyproject.toml): started Building wheel for omero-web (pyproject.toml): finished with status 'done' Created wheel for omero-web: filename=omero_web-5.27.3.dev0-py3-none-any.whl size=2766670 sha256=428eda79ce11a7f7666434ff2662694655b53798b8b6c6eb06d5ca5d41993b5b Stored in directory: /home/omero/.cache/pip/wheels/f0/b3/64/dc77f8e1a2385b2f8ef248e63a59b8ba8d51c2c928e147e9d6 Successfully built omero-web Installing collected packages: django-csp, omero-web Attempting uninstall: omero-web Found existing installation: omero-web 5.27.2 Uninstalling omero-web-5.27.2: Successfully uninstalled omero-web-5.27.2 Successfully installed django-csp-3.8 omero-web-5.27.3.dev0 + pip install omero-certificates Collecting omero-certificates Using cached omero_certificates-0.3.2-py3-none-any.whl.metadata (3.7 kB) Requirement already satisfied: omero-py>=5.6.0 in ./.venv3/lib/python3.9/site-packages (from omero-certificates) (5.19.6.dev0) Collecting distro==1.8.0 (from omero-certificates) Using cached distro-1.8.0-py3-none-any.whl.metadata (6.9 kB) Requirement already satisfied: urllib3<2 in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.6.0->omero-certificates) (1.26.20) Requirement already satisfied: appdirs in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.6.0->omero-certificates) (1.4.4) 
Requirement already satisfied: future in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.6.0->omero-certificates) (1.0.0) Requirement already satisfied: numpy<2 in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.6.0->omero-certificates) (1.26.4) Requirement already satisfied: Pillow>=10.0.0 in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.6.0->omero-certificates) (11.0.0) Requirement already satisfied: PyYAML in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.6.0->omero-certificates) (6.0.2) Requirement already satisfied: zeroc-ice<3.7,>=3.6.5 in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.6.0->omero-certificates) (3.6.5) Requirement already satisfied: requests in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.6.0->omero-certificates) (2.32.3) Requirement already satisfied: portalocker in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.6.0->omero-certificates) (2.10.1) Requirement already satisfied: charset-normalizer<4,>=2 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py>=5.6.0->omero-certificates) (3.4.0) Requirement already satisfied: idna<4,>=2.5 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py>=5.6.0->omero-certificates) (3.10) Requirement already satisfied: certifi>=2017.4.17 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py>=5.6.0->omero-certificates) (2024.8.30) Using cached omero_certificates-0.3.2-py3-none-any.whl (13 kB) Using cached distro-1.8.0-py3-none-any.whl (20 kB) Installing collected packages: distro, omero-certificates Successfully installed distro-1.8.0 omero-certificates-0.3.2 + for x in omero-cli-duplicate + pip install git+https://github.com/snoopycrimecop/omero-cli-duplicate.git@merge_ci#egg=omero-cli-duplicate Collecting omero-cli-duplicate Cloning https://github.com/snoopycrimecop/omero-cli-duplicate.git (to revision merge_ci) to /tmp/pip-install-926juzop/omero-cli-duplicate_5467facaa7eb4c69a988ea041c3ab049 Running command git clone --filter=blob:none --quiet https://github.com/snoopycrimecop/omero-cli-duplicate.git /tmp/pip-install-926juzop/omero-cli-duplicate_5467facaa7eb4c69a988ea041c3ab049 Running command git checkout -b merge_ci --track origin/merge_ci Switched to a new branch 'merge_ci' branch 'merge_ci' set up to track 'origin/merge_ci'. 
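The two install patterns traced in this phase can be reproduced by hand against the same virtualenv. A minimal sketch, assuming the locally built dev sdists sit in the workspace root and the .venv3 virtualenv is already activated (the package and branch names are the ones shown in the trace; anything else would be illustrative):

    # Upgrade the locally built OMERO dev sdists over the released versions already in the venv
    for x in *.tar.gz; do
        pip install -U "$x"
    done

    # Install an OMERO CLI plugin directly from a branch of its git repository
    pip install 'git+https://github.com/snoopycrimecop/omero-cli-duplicate.git@merge_ci#egg=omero-cli-duplicate'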
Resolved https://github.com/snoopycrimecop/omero-cli-duplicate.git to commit 0b078cd4bd16b51bf8212680bc6059a0834b63be Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Requirement already satisfied: omero-py>=5.8 in ./.venv3/lib/python3.9/site-packages (from omero-cli-duplicate) (5.19.6.dev0) Requirement already satisfied: urllib3<2 in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.8->omero-cli-duplicate) (1.26.20) Requirement already satisfied: appdirs in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.8->omero-cli-duplicate) (1.4.4) Requirement already satisfied: future in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.8->omero-cli-duplicate) (1.0.0) Requirement already satisfied: numpy<2 in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.8->omero-cli-duplicate) (1.26.4) Requirement already satisfied: Pillow>=10.0.0 in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.8->omero-cli-duplicate) (11.0.0) Requirement already satisfied: PyYAML in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.8->omero-cli-duplicate) (6.0.2) Requirement already satisfied: zeroc-ice<3.7,>=3.6.5 in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.8->omero-cli-duplicate) (3.6.5) Requirement already satisfied: requests in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.8->omero-cli-duplicate) (2.32.3) Requirement already satisfied: portalocker in ./.venv3/lib/python3.9/site-packages (from omero-py>=5.8->omero-cli-duplicate) (2.10.1) Requirement already satisfied: charset-normalizer<4,>=2 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py>=5.8->omero-cli-duplicate) (3.4.0) Requirement already satisfied: idna<4,>=2.5 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py>=5.8->omero-cli-duplicate) (3.10) Requirement already satisfied: certifi>=2017.4.17 in ./.venv3/lib/python3.9/site-packages (from requests->omero-py>=5.8->omero-cli-duplicate) (2024.8.30) Building wheels for collected packages: omero-cli-duplicate Building wheel for omero-cli-duplicate (pyproject.toml): started Building wheel for omero-cli-duplicate (pyproject.toml): finished with status 'done' Created wheel for omero-cli-duplicate: filename=omero_cli_duplicate-0.4.1.dev0-py3-none-any.whl size=11442 sha256=357f744930983c5c65aaca49e113478e0273643ffedb7770f333c9f648a63172 Stored in directory: /tmp/pip-ephem-wheel-cache-l5bqpgmw/wheels/f1/55/ac/0cb8b093a4c417e5157143839711e7613efa64eaa21c29ef84 Successfully built omero-cli-duplicate Installing collected packages: omero-cli-duplicate Successfully installed omero-cli-duplicate-0.4.1.dev0 + source /home/omero/workspace/OMERO-test-integration/src/docs/hudson/functions.sh + /home/omero/workspace/OMERO-test-integration/src/build.py build-default test-compile OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0 Buildfile: /home/omero/workspace/OMERO-test-integration/src/build.xml check-ivy: Creating /home/omero/workspace/OMERO-test-integration/src/etc/local.properties Created dir: /home/omero/workspace/OMERO-test-integration/src/target :: Apache Ivy 2.4.0 - 20141213170938 :: http://ant.apache.org/ivy/ :: :: loading settings :: file = /home/omero/workspace/OMERO-test-integration/src/etc/ivysettings.xml build-scons: Expanding: 
/home/omero/workspace/OMERO-test-integration/src/lib/repository/scons-local-2.1.0.zip into /home/omero/workspace/OMERO-test-integration/src/target/scons build: No sub-builds to iterate on Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroWeb... Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroWeb... Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroFS... Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava... tools-init: no-op retrieve: :: Apache Ivy 2.4.0 - 20141213170938 :: http://ant.apache.org/ivy/ :: :: loading settings :: file = /home/omero/workspace/OMERO-test-integration/src/etc/ivysettings.xml WARNING: An illegal reflective access operation has occurred WARNING: Illegal reflective access by org.apache.ivy.util.url.IvyAuthenticator (file:/home/omero/workspace/OMERO-test-integration/src/lib/repository/ivy-2.4.0.jar) to field java.net.Authenticator.theAuthenticator WARNING: Please consider reporting this to the maintainers of org.apache.ivy.util.url.IvyAuthenticator WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations WARNING: All illegal access operations will be denied in a future release prepare: Preparing: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/libs Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/.done Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/src Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/resources Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/test-classes Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/reports Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/resources Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/resources Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/resources Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/resources Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes pre-compile: Expanding: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/libs/glacier2.jar into /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated-classes Expanding: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/libs/ice.jar into /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated-classes Expanding: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/libs/icegrid.jar into 
/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated-classes Expanding: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/libs/icestorm.jar into /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated-classes Expanding: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/libs/ome-xml.jar into /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated-classes compile: no-op package: Building jar: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/omero_client.jar install: :: delivering :: omero#omero_client;working@bdaf9f08f5d1 :: 5.6.3-513-75ed6e6d79-ice36-ice36 :: integration :: Thu Oct 24 01:07:42 UTC 2024 delivering ivy file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/ivy.xml :: publishing :: omero#omero_client published omero_client to /home/omero/workspace/OMERO-test-integration/src/target/repository/omero/omero_client/5.6.3-513-75ed6e6d79-ice36-ice36/omero_client-5.6.3-513-75ed6e6d79-ice36-ice36.jar published ivy to /home/omero/workspace/OMERO-test-integration/src/target/repository/omero/omero_client/5.6.3-513-75ed6e6d79-ice36-ice36/omero_client-5.6.3-513-75ed6e6d79-ice36-ice36.xml Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy... Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy... retrieve: :: Apache Ivy 2.4.0 - 20141213170938 :: http://ant.apache.org/ivy/ :: :: loading settings :: file = /home/omero/workspace/OMERO-test-integration/src/etc/ivysettings.xml Entering /home/omero/workspace/OMERO-test-integration/src/components/tools... dist: Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/target/service-classes/META-INF Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/target/lib/server Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/components/tools/target/downloads/scripts Expanding: /home/omero/workspace/OMERO-test-integration/src/components/tools/target/downloads/scripts/omero_scripts-5.8.4.dev0.tar.gz into /home/omero/workspace/OMERO-test-integration/src/components/tools/target/downloads/scripts Copying 28 files to /home/omero/workspace/OMERO-test-integration/src/components/tools/target/lib/scripts Copying 28 files to /home/omero/workspace/OMERO-test-integration/src/dist Copied 9 empty directories to 1 empty directory under /home/omero/workspace/OMERO-test-integration/src/dist Entering /home/omero/workspace/OMERO-test-integration/src/components/tests... Entering /home/omero/workspace/OMERO-test-integration/src/components/tests/ui... 
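This compile phase is driven by the build.py ant wrapper invoked earlier in the trace. A minimal sketch of rerunning the same targets by hand from the unpacked source tree (same path and targets as above; the "Ignoring option MaxPermSize" message is reported by the JVM itself as ignored and is harmless on Java 11):

    cd /home/omero/workspace/OMERO-test-integration/src
    # Rebuild the default components and compile the OmeroJava integration tests
    ./build.py build-default test-compile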
copy-licenses: Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/dist Created dir: /home/omero/workspace/OMERO-test-integration/src/dist/share Copying 44 files to /home/omero/workspace/OMERO-test-integration/src/dist/share copy-history: Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/dist copy-etc: Copying 28 files to /home/omero/workspace/OMERO-test-integration/src/dist/etc Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/dist/etc/templates copy-sql: Copying 60 files to /home/omero/workspace/OMERO-test-integration/src/dist copy-server: conflict on /home/omero/workspace/OMERO-test-integration/src/dist/lib/server/janino.jar in [server]: 3.0.11 won copy-client: Created dir: /home/omero/workspace/OMERO-test-integration/src/dist/lib/client conflict on /home/omero/workspace/OMERO-test-integration/src/dist/lib/client/janino.jar in [client]: 3.0.11 won update-version: Copying 1 file to /home/omero/workspace/OMERO-test-integration/src/dist/etc create-workdirs: Created dir: /home/omero/workspace/OMERO-test-integration/src/dist/etc/grid Created dir: /home/omero/workspace/OMERO-test-integration/src/dist/var copy-luts: Created dir: /home/omero/workspace/OMERO-test-integration/src/dist/lib/scripts/luts Copying 39 files to /home/omero/workspace/OMERO-test-integration/src/dist/lib/scripts/luts test-compile: :: loading settings :: file = /home/omero/workspace/OMERO-test-integration/src/etc/ivysettings.xml Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroFS... Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy... Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy... Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroWeb... Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroWeb... Entering /home/omero/workspace/OMERO-test-integration/src/components/tests/ui... Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava... testng-init: :: Apache Ivy 2.4.0 - 20141213170938 :: http://ant.apache.org/ivy/ :: :: loading settings :: file = /home/omero/workspace/OMERO-test-integration/src/etc/ivysettings.xml 01:08:07,993 |-INFO in ch.qos.logback.classic.LoggerContext[default] - This is logback-classic version 1.3.14 01:08:07,996 |-INFO in ch.qos.logback.classic.util.ContextInitializer@196ca821 - No custom configurators were discovered as a service. 01:08:07,996 |-INFO in ch.qos.logback.classic.util.ContextInitializer@196ca821 - Trying to configure with ch.qos.logback.classic.joran.SerializedModelConfigurator 01:08:08,002 |-INFO in ch.qos.logback.classic.util.ContextInitializer@196ca821 - Constructed configurator of type class ch.qos.logback.classic.joran.SerializedModelConfigurator 01:08:08,003 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.scmo] 01:08:08,003 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.scmo] 01:08:08,005 |-INFO in ch.qos.logback.classic.util.ContextInitializer@196ca821 - ch.qos.logback.classic.joran.SerializedModelConfigurator.configure() call lasted 2 milliseconds. 
ExecutionStatus=INVOKE_NEXT_IF_ANY 01:08:08,005 |-INFO in ch.qos.logback.classic.util.ContextInitializer@196ca821 - Trying to configure with ch.qos.logback.classic.util.DefaultJoranConfigurator 01:08:08,006 |-INFO in ch.qos.logback.classic.util.ContextInitializer@196ca821 - Constructed configurator of type class ch.qos.logback.classic.util.DefaultJoranConfigurator 01:08:08,007 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.xml] 01:08:08,007 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Found resource [logback.xml] at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes/logback.xml] 01:08:08,008 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@373636f3 - Resource [logback.xml] occurs multiple times on the classpath. 01:08:08,008 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@373636f3 - Resource [logback.xml] occurs at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes/logback.xml] 01:08:08,008 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@373636f3 - Resource [logback.xml] occurs at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/resources/logback.xml] 01:08:08,268 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [stderr] 01:08:08,268 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 01:08:08,288 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property 01:08:08,291 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@2ca413f6 - As of version 1.2.0 "immediateFlush" property should be set within the enclosing Appender. 01:08:08,291 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@2ca413f6 - Please move "immediateFlush" property into the enclosing appender. 01:08:08,358 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@2ca413f6 - Setting the "immediateFlush" property of the enclosing appender to true 01:08:08,358 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [velocity] to ERROR 01:08:08,359 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org] to ERROR 01:08:08,359 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [ome] to ERROR 01:08:08,359 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [loci] to ERROR 01:08:08,359 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to WARN 01:08:08,359 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [stderr] to Logger[ROOT] 01:08:08,360 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@66e478c7 - End of configuration. 01:08:08,362 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator@1a35f99a - Registering current configuration as safe fallback point 01:08:08,362 |-INFO in ch.qos.logback.classic.util.ContextInitializer@196ca821 - ch.qos.logback.classic.util.DefaultJoranConfigurator.configure() call lasted 356 milliseconds. 
ExecutionStatus=DO_NOT_INVOKE_NEXT_IF_ANY lifecycle.test-compile: Copying 3 files to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/test-classes Compiling 85 source files to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/test-classes warning: [options] bootstrap class path not set in conjunction with -source 8 /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/AbstractServerTest.java:61: warning: [deprecation] RawAccessRequest in omero.grid has been deprecated import omero.grid.RawAccessRequest; ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/DeleteServiceFilesTest.java:32: warning: [deprecation] RawAccessRequest in omero.grid has been deprecated import omero.grid.RawAccessRequest; ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java:53: warning: [deprecation] RawAccessRequest in omero.grid has been deprecated import omero.grid.RawAccessRequest; ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:31: warning: [deprecation] ImportJob in omero.model has been deprecated import omero.model.ImportJob; ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:32: warning: [deprecation] ImportJobI in omero.model has been deprecated import omero.model.ImportJobI; ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ModelMockFactory.java:1206: warning: [deprecation] newInstance() in Class has been deprecated IObject link = (IObject) linkClass.newInstance(); ^ where T is a type-variable: T extends Object declared in class Class /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/CreatePojosFixture2.java:269: warning: [deprecation] newInstance() in Class has been deprecated T copy = (T) obj.getClass().newInstance(); ^ where T is a type-variable: T extends Object declared in class Class /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/DeleteServiceFilesTest.java:418: warning: [deprecation] RawAccessRequest in omero.grid has been deprecated RawAccessRequest raw = new RawAccessRequest(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/DeleteServiceFilesTest.java:418: warning: [deprecation] RawAccessRequest in omero.grid has been deprecated RawAccessRequest raw = new RawAccessRequest(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/DeleteServiceFilesTest.java:419: warning: [deprecation] repoUuid in RawAccessRequest has been deprecated raw.repoUuid = legacy.root().getHash().getValue(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/DeleteServiceFilesTest.java:420: warning: [deprecation] command in RawAccessRequest has been deprecated raw.command = "rm"; ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/DeleteServiceFilesTest.java:421: warning: [deprecation] args in RawAccessRequest has been deprecated raw.args = Arrays.asList(path); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/DiskUsageTest.java:156: warning: [deprecation] newInstance() in Class has been deprecated final IObject link = 
linkClass.newInstance(); ^ where T is a type-variable: T extends Object declared in class Class /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/DuplicationTest.java:1332: warning: [deprecation] newInstance() in Class has been deprecated originalLink.setChild(annotationClass.newInstance()); ^ where T is a type-variable: T extends Object declared in class Class /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ImporterTest.java:1088: warning: [deprecation] findByImage(long,RoiOptions) in IRoiPrx has been deprecated RoiResult r = svc.findByImage(id, new RoiOptions()); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java:900: warning: [deprecation] RawAccessRequest in omero.grid has been deprecated final RawAccessRequest request = new RawAccessRequest(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java:900: warning: [deprecation] RawAccessRequest in omero.grid has been deprecated final RawAccessRequest request = new RawAccessRequest(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java:901: warning: [deprecation] repoUuid in RawAccessRequest has been deprecated request.repoUuid = repo.root().getHash().getValue(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java:902: warning: [deprecation] command in RawAccessRequest has been deprecated request.command = "exists"; ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java:903: warning: [deprecation] args in RawAccessRequest has been deprecated request.args = Collections.singletonList(dirPath); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java:939: warning: [deprecation] RawAccessRequest in omero.grid has been deprecated final RawAccessRequest request = new RawAccessRequest(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java:939: warning: [deprecation] RawAccessRequest in omero.grid has been deprecated final RawAccessRequest request = new RawAccessRequest(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java:940: warning: [deprecation] repoUuid in RawAccessRequest has been deprecated request.repoUuid = repo.root().getHash().getValue(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java:941: warning: [deprecation] command in RawAccessRequest has been deprecated request.command = "exists"; ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ManagedRepositoryTest.java:942: warning: [deprecation] args in RawAccessRequest has been deprecated request.args = Collections.singletonList(dirName); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:255: warning: [deprecation] ImportJob in omero.model has been deprecated final ImportJob importJob = new ImportJobI(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:255: warning: [deprecation] ImportJobI in omero.model 
has been deprecated final ImportJob importJob = new ImportJobI(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:266: warning: [deprecation] setImageName(RString) in ImportJob has been deprecated importJob.setImageName(omero.rtypes.rstring(name)); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:268: warning: [deprecation] setImageDescription(RString) in ImportJob has been deprecated importJob.setImageDescription(omero.rtypes.rstring(desc)); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:273: warning: [deprecation] ImportJob in omero.model has been deprecated ImportJob sent = (ImportJob) iUpdate.saveAndReturnObject(importJob); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:273: warning: [deprecation] ImportJob in omero.model has been deprecated ImportJob sent = (ImportJob) iUpdate.saveAndReturnObject(importJob); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:274: warning: [deprecation] getImageName() in ImportJob has been deprecated final String savedName = sent.getImageName().getValue().toString(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:275: warning: [deprecation] getImageDescription() in ImportJob has been deprecated final String savedDesc = sent.getImageDescription().getValue().toString(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:277: warning: [deprecation] ImportJob in omero.model has been deprecated final ImportJob retrievedImportJob = (ImportJob) iQuery.get("ImportJob", id); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:277: warning: [deprecation] ImportJob in omero.model has been deprecated final ImportJob retrievedImportJob = (ImportJob) iQuery.get("ImportJob", id); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:278: warning: [deprecation] getImageName() in ImportJob has been deprecated final String retrievedName = retrievedImportJob.getImageName().getValue().toString(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/ObjectPropertiesTest.java:279: warning: [deprecation] getImageDescription() in ImportJob has been deprecated final String retrievedDesc = retrievedImportJob.getImageDescription().getValue().toString(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:141: warning: [deprecation] findByImage(long,RoiOptions) in IRoiPrx has been deprecated RoiResult r = svc.findByImage(image.getId().getValue(), ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:190: warning: [deprecation] getRoiMeasurements(long,RoiOptions) in IRoiPrx has been deprecated List l = svc.getRoiMeasurements(image.getId().getValue(), ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:220: warning: [deprecation] getRoiMeasurements(long,RoiOptions) in IRoiPrx has been deprecated l = 
svc.getRoiMeasurements(image.getId().getValue(), options); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:241: warning: [deprecation] getRoiMeasurements(long,RoiOptions) in IRoiPrx has been deprecated l = svc.getRoiMeasurements(image.getId().getValue(), options); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:313: warning: [deprecation] getRoiMeasurements(long,RoiOptions) in IRoiPrx has been deprecated List l = svc.getRoiMeasurements(image.getId().getValue(), ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:320: warning: [deprecation] getMeasuredRoisMap(long,List,RoiOptions) in IRoiPrx has been deprecated Map values = svc.getMeasuredRoisMap(image.getId() ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:386: warning: [deprecation] getRoiMeasurements(long,RoiOptions) in IRoiPrx has been deprecated List l = svc.getRoiMeasurements(image.getId().getValue(), ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:390: warning: [deprecation] getTable(long) in IRoiPrx has been deprecated table = svc.getTable(f.getId().getValue()); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:412: warning: [deprecation] findByImage(long,RoiOptions) in IRoiPrx has been deprecated RoiResult r = svc.findByImage(image.getId().getValue(), ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:426: warning: [deprecation] findByRoi(long,RoiOptions) in IRoiPrx has been deprecated r = svc.findByRoi(r1.getId().getValue(), new RoiOptions()); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:437: warning: [deprecation] findByRoi(long,RoiOptions) in IRoiPrx has been deprecated r = svc.findByRoi(r2.getId().getValue(), new RoiOptions()); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:448: warning: [deprecation] findByPlane(long,int,int,RoiOptions) in IRoiPrx has been deprecated r = svc.findByPlane(image.getId().getValue(), 1, 0, new RoiOptions()); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:459: warning: [deprecation] findByPlane(long,int,int,RoiOptions) in IRoiPrx has been deprecated r = svc.findByPlane(image.getId().getValue(), 1, 1, new RoiOptions()); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:733: warning: [deprecation] Integer(int) in Integer has been deprecated null, new Integer(0), new Integer(0), new int[] {0}, "null ids"}); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:733: warning: [deprecation] Integer(int) in Integer has been deprecated null, new Integer(0), new Integer(0), new int[] {0}, "null ids"}); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:735: warning: [deprecation] Integer(int) in Integer has been deprecated new ArrayList(), new Integer(0), new Integer(0), new int[] {0}, ^ 
/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:735: warning: [deprecation] Integer(int) in Integer has been deprecated new ArrayList(), new Integer(0), new Integer(0), new int[] {0}, ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:738: warning: [deprecation] Long(long) in Long has been deprecated Arrays.asList(new Long[] {new Long (-1)}), new Integer(0), new Integer(0), ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:738: warning: [deprecation] Integer(int) in Integer has been deprecated Arrays.asList(new Long[] {new Long (-1)}), new Integer(0), new Integer(0), ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:738: warning: [deprecation] Integer(int) in Integer has been deprecated Arrays.asList(new Long[] {new Long (-1)}), new Integer(0), new Integer(0), ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:740: warning: [deprecation] Integer(int) in Integer has been deprecated inputs.add(new Object[] { ids, new Integer(10), new Integer(10), new int[] {0}, ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:740: warning: [deprecation] Integer(int) in Integer has been deprecated inputs.add(new Object[] { ids, new Integer(10), new Integer(10), new int[] {0}, ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:742: warning: [deprecation] Integer(int) in Integer has been deprecated inputs.add(new Object[] { ids, new Integer(0), new Integer(-10), new int[] {0}, ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:742: warning: [deprecation] Integer(int) in Integer has been deprecated inputs.add(new Object[] { ids, new Integer(0), new Integer(-10), new int[] {0}, ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:744: warning: [deprecation] Integer(int) in Integer has been deprecated inputs.add(new Object[] { ids, new Integer(0), new Integer(0), new int[] {200}, ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:744: warning: [deprecation] Integer(int) in Integer has been deprecated inputs.add(new Object[] { ids, new Integer(0), new Integer(0), new int[] {200}, ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:746: warning: [deprecation] Integer(int) in Integer has been deprecated inputs.add(new Object[] { ids, new Integer(0), new Integer(0), null, ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:746: warning: [deprecation] Integer(int) in Integer has been deprecated inputs.add(new Object[] { ids, new Integer(0), new Integer(0), null, ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:748: warning: [deprecation] Integer(int) in Integer has been deprecated inputs.add(new Object[] { ids, new Integer(0), new Integer(0), new int[] {}, ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:748: warning: [deprecation] 
Integer(int) in Integer has been deprecated inputs.add(new Object[] { ids, new Integer(0), new Integer(0), new int[] {}, ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:755: warning: [deprecation] getShapeStatsRestricted(List,int,int,int[]) in IRoiPrx has been deprecated svc.getShapeStatsRestricted( ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/RoiServiceTest.java:764: warning: [deprecation] getShapeStatsRestricted(List,int,int,int[]) in IRoiPrx has been deprecated final ShapeStats [] stats = svc.getShapeStatsRestricted(ids, 0, 0, new int[] {0}); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/chgrp/HierarchyMoveTest.java:1030: warning: [deprecation] setRelatedTo(Pixels) in Pixels has been deprecated projection.getPrimaryPixels().setRelatedTo((Pixels) original.getPrimaryPixels().proxy()); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/AnnotationDeleteTest.java:478: warning: [deprecation] typesToIgnore in Delete2 has been deprecated request.typesToIgnore = Collections.singletonList("TagAnnotation"); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/AnnotationDeleteTest.java:495: warning: [deprecation] typesToIgnore in Delete2 has been deprecated request.typesToIgnore = Collections.singletonList("IAnnotationLink"); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/AnnotationDeleteTest.java:501: warning: [deprecation] typesToIgnore in Delete2 has been deprecated request.typesToIgnore = Collections.singletonList("Annotation"); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/AnnotationDeleteTest.java:547: warning: [deprecation] typesToIgnore in Delete2 has been deprecated request.typesToIgnore = Collections.singletonList("Annotation"); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/HierarchyDeleteTest.java:537: warning: [deprecation] setRelatedTo(Pixels) in Pixels has been deprecated projection.getPrimaryPixels().setRelatedTo((Pixels) original.getPrimaryPixels().proxy()); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RelatedToTest.java:37: warning: [deprecation] setRelatedTo(Pixels) in Pixels has been deprecated p2.setRelatedTo(p1); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RelatedToTest.java:39: warning: [deprecation] getRelatedTo() in Pixels has been deprecated Assert.assertEquals(p1.getId(), p2.getRelatedTo().getId()); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RelatedToTest.java:66: warning: [deprecation] setRelatedTo(Pixels) in Pixels has been deprecated pixels1.setRelatedTo(pixels2); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RelatedToTest.java:68: warning: [deprecation] getRelatedTo() in Pixels has been deprecated Pixels pixels = pixels1.getRelatedTo(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RelatedToTest.java:88: warning: [deprecation] getRelatedTo() in Pixels has been deprecated Assert.assertNull(pixels1.getRelatedTo()); ^ 
/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RelatedToTest.java:106: warning: [deprecation] setRelatedTo(Pixels) in Pixels has been deprecated pixels1.setRelatedTo(pixels2); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RelatedToTest.java:108: warning: [deprecation] getRelatedTo() in Pixels has been deprecated Pixels pixels = pixels1.getRelatedTo(); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RelatedToTest.java:128: warning: [deprecation] getRelatedTo() in Pixels has been deprecated Assert.assertNull(pixels1.getRelatedTo()); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RoiDeleteTest.java:113: warning: [deprecation] getRoiMeasurements(long,RoiOptions) in IRoiPrx has been deprecated List l = svc.getRoiMeasurements(image.getId().getValue(), ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RoiDeleteTest.java:147: warning: [deprecation] getRoiMeasurements(long,RoiOptions) in IRoiPrx has been deprecated l = svc.getRoiMeasurements(image.getId().getValue(), options); ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RoiDeleteTest.java:186: warning: [deprecation] getRoiMeasurements(long,RoiOptions) in IRoiPrx has been deprecated List l = svc.getRoiMeasurements(image.getId().getValue(), ^ /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/test/integration/delete/RoiDeleteTest.java:221: warning: [deprecation] getRoiMeasurements(long,RoiOptions) in IRoiPrx has been deprecated l = svc.getRoiMeasurements(image.getId().getValue(), options); ^ Note: Some input files use unchecked or unsafe operations. Note: Recompile with -Xlint:unchecked for details. 
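Most of the [deprecation] warnings above (the compiler totals them as "88 warnings" just below) fall into two groups: OMERO-specific members such as RawAccessRequest, ImportJob, Pixels.setRelatedTo()/getRelatedTo() and the IRoiPrx ROI methods, whose replacements are not shown in this log, and plain JDK deprecations, Class.newInstance() and the boxed-primitive constructors new Integer(int)/new Long(long), which do have standard drop-in replacements. The following is only an illustrative sketch of the JDK fixes, not code from the test suite; StringBuilder stands in for whatever class the tests instantiate reflectively.

public class DeprecationFixes {

    // Class.newInstance() is deprecated since Java 9 because it propagates
    // checked exceptions thrown by the constructor without declaring them;
    // getDeclaredConstructor().newInstance() wraps them in
    // InvocationTargetException instead.
    static <T> T instantiate(Class<T> type) throws ReflectiveOperationException {
        return type.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws ReflectiveOperationException {
        // The boxed-primitive constructors are deprecated since Java 9;
        // valueOf() (or plain autoboxing) is the replacement.
        Integer zero = Integer.valueOf(0);    // instead of new Integer(0)
        Long minusOne = Long.valueOf(-1L);    // instead of new Long(-1)
        StringBuilder sb = instantiate(StringBuilder.class);
        System.out.println(zero + " " + minusOne + " " + sb.append("ok"));
    }
}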
88 warnings Deleting: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/_omero_build_566371656.tmp Building jar: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/OmeroJava-test.jar :: loading settings :: file = /home/omero/workspace/OMERO-test-integration/src/etc/ivysettings.xml :: delivering :: omero#OmeroJava-test;working@bdaf9f08f5d1 :: 5.6.3-513-75ed6e6d79-ice36-ice36 :: integration :: Thu Oct 24 01:08:04 UTC 2024 delivering ivy file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/OmeroJava-test.xml :: publishing :: omero#OmeroJava-test published OmeroJava-test to /home/omero/workspace/OMERO-test-integration/src/target/test-repository/OmeroJava-test-5.6.3-513-75ed6e6d79-ice36-ice36.jar published ivy to /home/omero/workspace/OMERO-test-integration/src/target/test-repository/OmeroJava-test-5.6.3-513-75ed6e6d79-ice36-ice36.xml BUILD SUCCESSFUL Total time: 33 seconds + OMERO_DIST=/home/omero/workspace/OMERO-test-integration/src/dist + export OMERODIR=/home/omero/workspace/OMERO-test-integration/src/dist + OMERODIR=/home/omero/workspace/OMERO-test-integration/src/dist + omero config set omero.db.name OMERO-test-integration + omero config set omero.db.host pg + omero config set omero.db.user postgres + omero config set omero.db.poolsize 50 + omero config set omero.jvmcfg.max_system_memory.blitz 64000 + omero config set omero.data.dir /home/omero/workspace/OMERO-test-integration/data + omero config set omero.ports.prefix 1 + omero config set omero.web.server_list '[["testintegration",14064,"testintegration"]]' + omero certificates OpenSSL 3.0.7 1 Nov 2022 (Library: OpenSSL 3.0.7 1 Nov 2022) certificates created: /home/omero/workspace/OMERO-test-integration/data/certs/server.key /home/omero/workspace/OMERO-test-integration/data/certs/server.pem /home/omero/workspace/OMERO-test-integration/data/certs/server.p12 + createdb -h pg -U postgres OMERO-test-integration + omero db script -f dbsetup.sql '' '' omero WARNING: Positional arguments are deprecated Using OMERO5.4 for version Using 0 for patch Using password from commandline + psql -h pg -U postgres -d OMERO-test-integration -f dbsetup.sql BEGIN CREATE FUNCTION CREATE FUNCTION assert_db_server_prerequisites -------------------------------- (1 row) DROP FUNCTION DROP FUNCTION CREATE DOMAIN CREATE DOMAIN CREATE DOMAIN CREATE DOMAIN CREATE DOMAIN CREATE TYPE CREATE TYPE CREATE TYPE CREATE TYPE CREATE TYPE CREATE TYPE CREATE TYPE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE 
CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE CREATE TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE psql:dbsetup.sql:2842: NOTICE: identifier "fkcontraststretchingcontext_codomainmapcontext_id_codomainmapcontext" will be truncated to "fkcontraststretchingcontext_codomainmapcontext_id_codomainmapco" ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE 
ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE psql:dbsetup.sql:4712: NOTICE: identifier "fklogicalchannel_photometricinterpretation_photometricinterpretation" will be truncated to 
"fklogicalchannel_photometricinterpretation_photometricinterpret" ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE psql:dbsetup.sql:5697: NOTICE: identifier "fkreverseintensitycontext_codomainmapcontext_id_codomainmapcontext" will be truncated to "fkreverseintensitycontext_codomainmapcontext_id_codomainmapcont" ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE 
ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE CREATE TABLE DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW DROP TABLE CREATE VIEW CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE 
INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE FUNCTION SET CONSTRAINTS CREATE 
FUNCTION CREATE SEQUENCE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE CREATE INDEX CREATE FUNCTION CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE 
SEQUENCE INSERT 0 1 CREATE SEQUENCE INSERT 0 1 CREATE FUNCTION CREATE FUNCTION CREATE FUNCTION CREATE TABLE CREATE INDEX CREATE INDEX CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE FUNCTION CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE FUNCTION CREATE FUNCTION CREATE FUNCTION CREATE FUNCTION CREATE FUNCTION CREATE FUNCTION CREATE FUNCTION CREATE FUNCTION CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER ALTER TABLE ALTER TABLE ALTER TABLE INSERT 0 1 ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 UPDATE 1 UPDATE 1 ALTER TABLE ALTER TABLE ALTER TABLE INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 
INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 INSERT 0 1 UPDATE 1 UPDATE 1 UPDATE 1 UPDATE 1 UPDATE 1 UPDATE 1 UPDATE 1 UPDATE 1 UPDATE 1 UPDATE 1 UPDATE 1 ALTER TABLE CREATE FUNCTION CREATE TABLE INSERT 0 1 ALTER TABLE ALTER TABLE ALTER TABLE CREATE INDEX ALTER TABLE ALTER TABLE CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX ALTER TABLE CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE TABLE INSERT 0 1 INSERT 0 1 CREATE TABLE INSERT 0 1 CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE TRIGGER CREATE TABLE CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE INDEX CREATE FUNCTION CREATE TRIGGER ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE CREATE FUNCTION CREATE TRIGGER CREATE FUNCTION CREATE FUNCTION CREATE FUNCTION CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER DROP TABLE DROP TABLE DROP TABLE DROP TABLE DROP TABLE CREATE VIEW CREATE VIEW CREATE VIEW CREATE VIEW CREATE VIEW ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE ALTER TABLE 
ALTER TABLE ALTER TABLE CREATE FUNCTION CREATE TRIGGER CREATE TABLE CREATE INDEX CREATE FUNCTION CREATE FUNCTION CREATE FUNCTION CREATE FUNCTION CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE TRIGGER CREATE FUNCTION CREATE FUNCTION CREATE TRIGGER CREATE TRIGGER CREATE INDEX UPDATE 1 COMMIT + rm dbsetup.sql + export OMERO_PROFILE=/home/omero/workspace/OMERO-test-integration/config + OMERO_PROFILE=/home/omero/workspace/OMERO-test-integration/config + mkdir -p /home/omero/workspace/OMERO-test-integration/config/blitz + cp /home/omero/workspace/OMERO-test-integration/src/dist/etc/blitz/mail-senders.example /home/omero/workspace/OMERO-test-integration/src/dist/etc/blitz/mail-server.example /home/omero/workspace/OMERO-test-integration/config/blitz + for f in $OMERO_PROFILE/blitz/* ++ basename /home/omero/workspace/OMERO-test-integration/config/blitz/mail-senders.example .example + mv /home/omero/workspace/OMERO-test-integration/config/blitz/mail-senders.example /home/omero/workspace/OMERO-test-integration/config/blitz/mail-senders.xml + for f in $OMERO_PROFILE/blitz/* ++ basename /home/omero/workspace/OMERO-test-integration/config/blitz/mail-server.example .example + mv /home/omero/workspace/OMERO-test-integration/config/blitz/mail-server.example /home/omero/workspace/OMERO-test-integration/config/blitz/mail-server.xml + omero config set omero.mail.config true + omero config set omero.mail.fake true + omero config set omero.mail.port 2525 + BUILD_ID=DONT_KILL_ME + omero admin start No descriptor given. Using etc/grid/default.xml Creating /home/omero/workspace/OMERO-test-integration/src/dist/var/master Initializing /home/omero/workspace/OMERO-test-integration/src/dist/var/log Creating /home/omero/workspace/OMERO-test-integration/src/dist/var/registry Waiting on startup. Use CTRL-C to exit + omero admin waitup Waiting on startup. Use CTRL-C to exit + omero admin diagnostics ERROR:omero.util.UpgradeCheck:HTTPConnectionPool(host='upgrade.openmicroscopy.org.uk', port=80): Max retries exceeded with url: /?version=5.19.6.dev0&os.name=Linux&os.arch=x86_64&os.version=Linux-5.14.0-427.40.1.el9_4.x86_64-x86_64-with-glibc2.34&python.version=3.9.18&python.compiler=GCC+11.4.1+20231218+%28Red+Hat+11.4.1-3%29&python.build=main&python.build=Aug+23+2024+00%3A00%3A00 (Caused by ConnectTimeoutError(, 'Connection to upgrade.openmicroscopy.org.uk timed out. 
(connect timeout=6.0)')) ================================================================================ OMERO Diagnostics (admin) ================================================================================ Commands: java -version 11.0.24 (/usr/bin/java) Commands: python -V 3.9.18 (/home/omero/workspace/OMERO-test-integration/.venv3/bin/python) Commands: icegridnode --version 3.6.5 (/opt/ice-3.6.5/bin/icegridnode) Commands: icegridadmin --version 3.6.5 (/opt/ice-3.6.5/bin/icegridadmin) Commands: psql --version 16.4 (/usr/bin/psql) Commands: openssl version 3.0.71 (/usr/bin/openssl) Component: OMERO.py 5.19.6.dev0 Component: OMERO.server 5.6.3-513-75ed6e6d79-ice36-ice36 Server: icegridnode running Server: Blitz-0 active (pid = 374535, enabled) Server: DropBox active (pid = 374561, enabled) Server: FileServer active (pid = 374568, enabled) Server: Indexer-0 active (pid = 374611, enabled) Server: MonitorServer active (pid = 374569, enabled) Server: OMERO.Glacier2 active (pid = 374609, enabled) Server: OMERO.IceStorm active (pid = 374571, enabled) Server: PixelData-0 active (pid = 374570, enabled) Server: Processor-0 activating (enabled) Server: Tables-0 activating (enabled) Server: TestDropBox inactive (enabled) Log dir: /home/omero/workspace/OMERO-test-integration/src/dist/var/log exists Log files: Blitz-0.log 133.7 KB errors=1 warnings=2 Log files: DropBox.log 1.4 KB Log files: FileServer.log 114 B Log files: Indexer-0.log 2.9 KB errors=0 warnings=2 Log files: MonitorServer.log 1.2 KB Log files: PixelData-0.log 3.7 KB errors=0 warnings=2 Log files: Processor-0.log 592 B Log files: Tables-0.log 841 B Log files: TestDropBox.log n/a Log files: master.err 2.1 KB Log files: master.out 41.5 KB Log files: Total size 0.19 MB Environment:OMERO_HOME=(unset) Environment:OMERODIR=/home/omero/workspace/OMERO-test-integration/src/dist Environment:OMERO_NODE=(unset) Environment:OMERO_MASTER=(unset) Environment:OMERO_USERDIR=(unset) Environment:OMERO_TMPDIR=(unset) Environment:PATH=/home/omero/workspace/OMERO-test-integration/.venv3/bin:/opt/ice-3.6.5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin Environment:PYTHONPATH=(unset) Environment:ICE_HOME=/opt/ice-3.6.5 Environment:LD_LIBRARY_PATH=(unset) Environment:DYLD_LIBRARY_PATH=(unset) OMERO SSL port:14064 OMERO TCP port:14063 OMERO data dir:'/home/omero/workspace/OMERO-test-integration/data' Exists? True Is writable? True OMERO temp dir:'/home/omero/omero/tmp' Exists? True Is writable? 
True (Size: 485861) JVM settings: Blitz-${index} -Xmx9600m -XX:MaxPermSize=1g -XX:+IgnoreUnrecognizedVMOptions JVM settings: Indexer-${index} -Xmx4800m -XX:MaxPermSize=1g -XX:+IgnoreUnrecognizedVMOptions JVM settings: PixelData-${index} -Xmx7200m -XX:MaxPermSize=1g -XX:+IgnoreUnrecognizedVMOptions JVM settings: Repository-${index} -Xmx4800m -XX:MaxPermSize=1g -XX:+IgnoreUnrecognizedVMOptions Jar: lib/server/formats-api.jar Bio-Formats API 8.0.0-SNAPSHOT 24 October 2024 62d359b4bc191e66a0e75fbc407c2b440d35be57 Jar: lib/server/formats-bsd.jar BSD Bio-Formats readers and writers 8.0.0-SNAPSHOT 24 October 2024 62d359b4bc191e66a0e75fbc407c2b440d35be57 Jar: lib/server/formats-gpl.jar Bio-Formats library 8.0.0-SNAPSHOT 24 October 2024 62d359b4bc191e66a0e75fbc407c2b440d35be57 Jar: lib/server/ome-codecs.jar OME Codecs 1.0.4-SNAPSHOT 24 October 2024 acd1ac3c9a4aa7d1ec69c8106e23103675af5952 Jar: lib/server/ome-common.jar OME Common Java 6.0.25-SNAPSHOT 24 October 2024 acd1ac3c9a4aa7d1ec69c8106e23103675af5952 Jar: lib/server/ome-jai.jar OME JAI 0.1.5-SNAPSHOT 24 October 2024 acd1ac3c9a4aa7d1ec69c8106e23103675af5952 Jar: lib/server/ome-mdbtools.jar MDB Tools (Java port) 5.3.4-SNAPSHOT 24 October 2024 acd1ac3c9a4aa7d1ec69c8106e23103675af5952 Jar: lib/server/ome-poi.jar OME POI 5.3.10-SNAPSHOT 24 October 2024 acd1ac3c9a4aa7d1ec69c8106e23103675af5952 Jar: lib/server/ome-xml.jar OME XML library 6.3.7-SNAPSHOT 24 October 2024 acd1ac3c9a4aa7d1ec69c8106e23103675af5952 Jar: lib/server/omero-blitz.jar jar 5.7.5-SNAPSHOT Jar: lib/server/omero-common.jar jar 5.6.8-SNAPSHOT Jar: lib/server/omero-gateway.jar jar 5.9.4-SNAPSHOT Jar: lib/server/omero-model.jar jar 5.6.16-SNAPSHOT Jar: lib/server/omero-renderer.jar jar 5.5.18-SNAPSHOT Jar: lib/server/omero-romio.jar jar 5.7.8-SNAPSHOT Jar: lib/server/omero-server.jar jar 5.6.13-SNAPSHOT + export OMERO_SESSION_DIR=/tmp/OMERO-test-integration/206 + OMERO_SESSION_DIR=/tmp/OMERO-test-integration/206 + export ICE_CONFIG=/home/omero/workspace/OMERO-test-integration/src/dist/etc/ice.config + ICE_CONFIG=/home/omero/workspace/OMERO-test-integration/src/dist/etc/ice.config + echo Running the integration tests with -Dtestng.useDefaultListeners=true Running the integration tests with -Dtestng.useDefaultListeners=true + /home/omero/workspace/OMERO-test-integration/src/build.py -f components/tools/OmeroJava/build.xml -Dtestng.useDefaultListeners=true -Dtestreports.dir=target/reports/integration integration OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0 Buildfile: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/build.xml Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava... testng-init: :: Apache Ivy 2.4.0 - 20141213170938 :: http://ant.apache.org/ivy/ :: :: loading settings :: file = /home/omero/workspace/OMERO-test-integration/src/etc/ivysettings.xml 01:09:15,760 |-INFO in ch.qos.logback.classic.LoggerContext[default] - This is logback-classic version 1.3.14 01:09:15,763 |-INFO in ch.qos.logback.classic.util.ContextInitializer@1e40fbb3 - No custom configurators were discovered as a service. 
01:09:15,763 |-INFO in ch.qos.logback.classic.util.ContextInitializer@1e40fbb3 - Trying to configure with ch.qos.logback.classic.joran.SerializedModelConfigurator
01:09:15,767 |-INFO in ch.qos.logback.classic.util.ContextInitializer@1e40fbb3 - Constructed configurator of type class ch.qos.logback.classic.joran.SerializedModelConfigurator
01:09:15,767 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.scmo]
01:09:15,768 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.scmo]
01:09:15,768 |-INFO in ch.qos.logback.classic.util.ContextInitializer@1e40fbb3 - ch.qos.logback.classic.joran.SerializedModelConfigurator.configure() call lasted 1 milliseconds. ExecutionStatus=INVOKE_NEXT_IF_ANY
01:09:15,769 |-INFO in ch.qos.logback.classic.util.ContextInitializer@1e40fbb3 - Trying to configure with ch.qos.logback.classic.util.DefaultJoranConfigurator
01:09:15,770 |-INFO in ch.qos.logback.classic.util.ContextInitializer@1e40fbb3 - Constructed configurator of type class ch.qos.logback.classic.util.DefaultJoranConfigurator
01:09:15,771 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.xml]
01:09:15,771 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Found resource [logback.xml] at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes/logback.xml]
01:09:15,772 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@1b560eb0 - Resource [logback.xml] occurs multiple times on the classpath.
01:09:15,772 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@1b560eb0 - Resource [logback.xml] occurs at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes/logback.xml]
01:09:15,772 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@1b560eb0 - Resource [logback.xml] occurs at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/resources/logback.xml]
01:09:15,985 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [stderr]
01:09:15,985 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender]
01:09:16,002 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property
01:09:16,004 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@9e02f84 - As of version 1.2.0 "immediateFlush" property should be set within the enclosing Appender.
01:09:16,004 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@9e02f84 - Please move "immediateFlush" property into the enclosing appender.
01:09:16,064 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@9e02f84 - Setting the "immediateFlush" property of the enclosing appender to true
01:09:16,065 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [velocity] to ERROR
01:09:16,065 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org] to ERROR
01:09:16,065 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [ome] to ERROR
01:09:16,065 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [loci] to ERROR
01:09:16,065 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to WARN
01:09:16,065 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [stderr] to Logger[ROOT]
01:09:16,066 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@1e6060f1 - End of configuration.
01:09:16,068 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator@7e49ded - Registering current configuration as safe fallback point
01:09:16,068 |-INFO in ch.qos.logback.classic.util.ContextInitializer@1e40fbb3 - ch.qos.logback.classic.util.DefaultJoranConfigurator.configure() call lasted 298 milliseconds. ExecutionStatus=DO_NOT_INVOKE_NEXT_IF_ANY
lifecycle.test-compile:
Deleting: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/_omero_build_420328800.tmp
Deleting: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/OmeroJava-test.xml
:: loading settings :: file = /home/omero/workspace/OMERO-test-integration/src/etc/ivysettings.xml
:: delivering :: omero#OmeroJava-test;working@bdaf9f08f5d1 :: 5.6.3-513-75ed6e6d79-ice36-ice36 :: integration :: Thu Oct 24 01:09:12 UTC 2024
    delivering ivy file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/OmeroJava-test.xml
:: publishing :: omero#OmeroJava-test
    published OmeroJava-test to /home/omero/workspace/OMERO-test-integration/src/target/test-repository/OmeroJava-test-5.6.3-513-75ed6e6d79-ice36-ice36.jar
    published ivy to /home/omero/workspace/OMERO-test-integration/src/target/test-repository/OmeroJava-test-5.6.3-513-75ed6e6d79-ice36-ice36.xml
integration:
OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
01:09:16,611 |-INFO in ch.qos.logback.classic.LoggerContext[default] - This is logback-classic version 1.3.14
01:09:16,615 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - No custom configurators were discovered as a service.
01:09:16,615 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - Trying to configure with ch.qos.logback.classic.joran.SerializedModelConfigurator
01:09:16,616 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - Constructed configurator of type class ch.qos.logback.classic.joran.SerializedModelConfigurator
01:09:16,617 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.scmo]
01:09:16,617 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.scmo]
01:09:16,618 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - ch.qos.logback.classic.joran.SerializedModelConfigurator.configure() call lasted 2 milliseconds. ExecutionStatus=INVOKE_NEXT_IF_ANY
01:09:16,618 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - Trying to configure with ch.qos.logback.classic.util.DefaultJoranConfigurator
01:09:16,618 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - Constructed configurator of type class ch.qos.logback.classic.util.DefaultJoranConfigurator
01:09:16,619 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.xml]
01:09:16,620 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Found resource [logback.xml] at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes/logback.xml]
01:09:16,620 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@10e31a9a - Resource [logback.xml] occurs multiple times on the classpath.
01:09:16,620 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@10e31a9a - Resource [logback.xml] occurs at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes/logback.xml]
01:09:16,620 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@10e31a9a - Resource [logback.xml] occurs at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/resources/logback.xml]
01:09:16,822 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [stderr]
01:09:16,822 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender]
01:09:16,830 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property
01:09:16,831 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@131774fe - As of version 1.2.0 "immediateFlush" property should be set within the enclosing Appender.
01:09:16,831 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@131774fe - Please move "immediateFlush" property into the enclosing appender.
01:09:16,858 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@131774fe - Setting the "immediateFlush" property of the enclosing appender to true
01:09:16,859 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [velocity] to ERROR
01:09:16,859 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org] to ERROR
01:09:16,859 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [ome] to ERROR
01:09:16,859 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [loci] to ERROR
01:09:16,859 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to WARN
01:09:16,859 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [stderr] to Logger[ROOT]
01:09:16,859 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@158d2680 - End of configuration.
01:09:16,860 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator@77847718 - Registering current configuration as safe fallback point
01:09:16,860 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - ch.qos.logback.classic.util.DefaultJoranConfigurator.configure() call lasted 242 milliseconds. ExecutionStatus=DO_NOT_INVOKE_NEXT_IF_ANY
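
The configurator output above effectively documents the logback.xml that was picked up from target/classes: a ConsoleAppender named [stderr] with a PatternLayoutEncoder (whose encoder-level "immediateFlush" setting is what triggers the migration warnings), the velocity, org, ome and loci loggers at ERROR, and the ROOT logger at WARN with [stderr] attached. As a reading aid only, a minimal logback.xml consistent with those messages might look like the sketch below; it is reconstructed from the log output, not taken from the repository, and the target and pattern values are assumed placeholders.

    <configuration>
      <!-- Appender named [stderr], type ConsoleAppender, as reported above -->
      <appender name="stderr" class="ch.qos.logback.core.ConsoleAppender">
        <!-- assumed target, suggested by the appender name -->
        <target>System.err</target>
        <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
          <!-- "immediateFlush" inside the encoder is what produces the
               "should be set within the enclosing Appender" warnings -->
          <immediateFlush>true</immediateFlush>
          <!-- assumed pattern, for illustration only -->
          <pattern>%d{yyyy-MM-dd HH:mm:ss,SSS} %-5level [%40.40logger] (%thread) %msg%n</pattern>
        </encoder>
      </appender>

      <logger name="velocity" level="ERROR"/>
      <logger name="org" level="ERROR"/>
      <logger name="ome" level="ERROR"/>
      <logger name="loci" level="ERROR"/>

      <root level="WARN">
        <appender-ref ref="stderr"/>
      </root>
    </configuration>
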
Oct 24, 2024 1:09:18 AM ome.system.OmeroContext prepareRefresh
INFO: Refreshing ome.system.OmeroContext@7f8633ae: startup date [Thu Oct 24 01:09:18 UTC 2024]; root of context hierarchy
Oct 24, 2024 1:09:18 AM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
INFO: Loading XML bean definitions from class path resource [ome/config.xml]
2624ec83-2e4c-4fe8-957f-37364d44d6a0
2024-10-24 01:14:35,693 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out:
java.lang.RuntimeException: Ice.CommunicatorDestroyedException
    at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845)
    at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: Ice.CommunicatorDestroyedException: null
    at IceInternal.Instance.proxyFactory(Instance.java:239)
    at IceInternal.BasicStream.writeProxy(BasicStream.java:2142)
    at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114)
    at omero.api.ServiceListHelper.write(ServiceListHelper.java:37)
    at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946)
    at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846)
    at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833)
    at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838)
    ... 7 common frames omitted
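
The stack trace above shows the shape of every ClientKeepAlive failure that follows: a keep-alive task scheduled on a ScheduledThreadPoolExecutor (the N-thread-1 threads) calls OMEROMetadataStoreClient.ping(), which calls ServiceFactoryPrx.keepAllAlive(), after the Ice communicator behind that connector has already been destroyed, so each ping surfaces as RuntimeException: Ice.CommunicatorDestroyedException. The sketch below is a minimal, self-contained illustration of that shutdown ordering using only java.util.concurrent; KeepAliveRaceSketch and FakeConnector are hypothetical names for illustration and are not OMERO or Ice classes.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    /**
     * Hypothetical sketch of the shutdown race visible in the log above:
     * a periodic keep-alive keeps firing after the underlying connection
     * (the Ice communicator in the real client) has been destroyed.
     */
    public class KeepAliveRaceSketch {

        /** Stand-in for the Ice communicator / OMERO connector (hypothetical). */
        static class FakeConnector {
            private volatile boolean destroyed = false;

            void keepAllAlive() {
                if (destroyed) {
                    // Mirrors Ice.CommunicatorDestroyedException being raised
                    // when a remote call is attempted after destroy().
                    throw new IllegalStateException("communicator destroyed");
                }
            }

            void destroy() {
                destroyed = true;
            }
        }

        public static void main(String[] args) throws InterruptedException {
            FakeConnector connector = new FakeConnector();
            ScheduledExecutorService executor = Executors.newScheduledThreadPool(1);

            // Periodic keep-alive, analogous to ClientKeepAlive.run() -> ping().
            executor.scheduleWithFixedDelay(() -> {
                try {
                    connector.keepAllAlive();
                } catch (Exception e) {
                    // The real client logs
                    // "Exception while executing ping(), logging Connector out".
                    System.err.println("ping() failed: " + e);
                }
            }, 0, 100, TimeUnit.MILLISECONDS);

            Thread.sleep(250);
            // Destroying the connection without first cancelling the keep-alive
            // leaves the scheduled task pinging a dead connection.
            connector.destroy();
            Thread.sleep(300);

            // Cancelling the keep-alive before (or together with) destroy()
            // would avoid the repeated errors.
            executor.shutdownNow();
        }
    }

On this reading, the repeated errors below would come from keep-alive timers that were still scheduled when their import's connection was torn down rather than from independent failures; that is a plausible interpretation of the log, not a confirmed diagnosis.
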
2024-10-24 01:14:52,400 ERROR [ ome.formats.importer.ImportLibrary] (sDataset-1) Cannot link to target
[The same ClientKeepAlive ping() failure, with a stack trace identical to the one above ("Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException ... 7 common frames omitted"), is then logged repeatedly; only the timestamp and scheduler thread differ:]
2024-10-24 01:14:55,423 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1)
2024-10-24 01:15:00,825 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1)
2024-10-24 01:15:46,531 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1)
2024-10-24 01:16:09,514 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1)
2024-10-24 01:19:07,967 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1)
2024-10-24 01:19:09,085 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1)
2024-10-24 01:19:10,060 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1)
2024-10-24 01:19:14,398 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1)
2024-10-24 01:19:15,143 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1)
2024-10-24 01:19:15,764 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1)
2024-10-24 01:19:16,453 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1)
2024-10-24 01:19:17,147 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1)
2024-10-24 01:19:17,753 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1)
2024-10-24 01:19:18,420 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1)
2024-10-24 01:19:19,159 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1)
2024-10-24 01:19:20,051 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1)
2024-10-24 01:19:20,873 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1)
2024-10-24 01:19:21,562 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1)
2024-10-24 01:19:22,172 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1)
2024-10-24 01:19:22,856 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1)
2024-10-24 01:19:23,553 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1)
2024-10-24 01:19:24,177 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1)
2024-10-24 01:19:24,925 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1)
2024-10-24 01:19:25,608 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1)
2024-10-24 01:19:26,543 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1)
2024-10-24 01:19:27,231 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1)
2024-10-24 01:19:27,964 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1)
2024-10-24 01:19:28,598 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1)
2024-10-24 01:19:29,257 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1)
2024-10-24 01:19:30,078 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1)
2024-10-24 01:19:30,754 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1)
2024-10-24 01:19:31,429 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1)
2024-10-24 01:19:32,369 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1)
2024-10-24 01:19:33,160 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1)
2024-10-24 01:19:34,015 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1)
2024-10-24 01:19:34,880 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1)
2024-10-24 01:19:35,865 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1)
2024-10-24 01:19:36,940 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out:
java.lang.RuntimeException: Ice.CommunicatorDestroyedException
    at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845)
    at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: Ice.CommunicatorDestroyedException: null
    at IceInternal.Instance.proxyFactory(Instance.java:239)
    at IceInternal.BasicStream.writeProxy(BasicStream.java:2142)
    at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114)
    at omero.api.ServiceListHelper.write(ServiceListHelper.java:37)
    at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946)
    at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846)
    at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833)
    at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838)
    ...
7 common frames omitted 2024-10-24 01:19:37,912 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:19:38,870 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:19:40,077 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:19:40,948 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:19:41,776 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:19:45,903 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:19:47,028 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:19:48,862 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:19:50,164 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:19:51,054 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:19:52,385 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:19:52,882 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:19:54,277 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:19:55,191 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:19:57,385 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:19:58,774 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:20:00,149 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:20:03,420 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:20:08,118 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:20:09,100 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:20:09,884 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:20:11,114 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:21:45,689 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:21:47,591 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:21:49,466 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:21:51,641 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:21:54,061 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:21:55,638 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:21:57,507 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:21:59,009 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:25:35,264 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:25:36,181 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
The identical ERROR entry from [ o.formats.importer.util.ClientKeepAlive] ("Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException", same stack trace as above) recurs on the importer's worker threads at:
2024-10-24 01:25:38,128 (8-thread-1)   01:25:40,048 (9-thread-1)   01:25:40,497 (0-thread-1)   01:25:42,158 (1-thread-1)
2024-10-24 01:25:44,090 (2-thread-1)   01:25:46,054 (3-thread-1)   01:25:48,890 (4-thread-1)   01:25:50,677 (5-thread-1)
2024-10-24 01:25:52,988 (6-thread-1)   01:25:54,828 (7-thread-1)   01:26:46,547 (8-thread-1)   01:26:48,707 (9-thread-1)
2024-10-24 01:26:50,775 (0-thread-1)   01:26:52,775 (1-thread-1)   01:26:54,847 (2-thread-1)   01:26:58,408 (3-thread-1)
2024-10-24 01:27:01,778 (4-thread-1)   01:27:05,175 (5-thread-1)   01:27:08,412 (6-thread-1)   01:27:11,339 (7-thread-1)
2024-10-24 01:27:13,782 (8-thread-1)   01:27:16,327 (9-thread-1)   01:27:18,638 (0-thread-1)   01:27:21,087 (1-thread-1)
2024-10-24 01:27:23,635 (2-thread-1)   01:27:26,162 (3-thread-1)   01:27:28,769 (4-thread-1)   01:27:31,294 (5-thread-1)
2024-10-24 01:27:34,044 (6-thread-1)   01:27:36,578 (7-thread-1)   01:27:39,501 (8-thread-1)   01:27:43,613 (9-thread-1)
2024-10-24 01:27:48,540 (0-thread-1)   01:27:52,654 (1-thread-1)   01:27:56,650 (2-thread-1)   01:27:59,308 (3-thread-1)
2024-10-24 01:28:01,900 (4-thread-1)   01:28:04,411 (5-thread-1)   01:28:07,042 (6-thread-1)   01:28:09,731 (7-thread-1)
2024-10-24 01:28:12,466 (8-thread-1)   01:28:15,085 (9-thread-1)   01:28:17,584 (0-thread-1)   01:28:20,399 (1-thread-1) ...
7 common frames omitted 2024-10-24 01:28:23,020 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:28:25,652 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:28:28,449 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:28:33,222 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:28:36,903 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:28:40,755 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:28:45,394 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:28:48,238 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:28:51,196 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:28:53,547 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:28:56,016 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:28:58,772 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:29:01,444 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:29:03,877 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:29:33,046 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:29:37,810 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:29:41,619 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:29:45,315 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:29:48,840 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:29:52,458 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:29:56,084 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:29:59,921 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:30:03,534 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:30:05,921 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:30:08,077 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:30:10,258 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:30:13,400 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:30:15,784 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:30:18,656 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:30:21,555 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:30:24,473 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:30:28,884 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:30:33,608 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:30:37,605 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:30:42,561 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:30:46,797 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:30:50,990 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:30:55,153 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:32:21,229 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:32:24,179 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:32:27,073 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:32:30,068 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:32:32,778 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:32:37,078 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:32:41,366 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:32:46,027 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:32:50,433 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:32:55,096 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:32:59,437 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:33:04,716 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:33:09,421 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:33:16,258 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:33:22,616 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:33:29,100 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:33:35,108 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:33:36,420 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:33:37,639 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:33:38,869 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:33:40,133 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:33:41,335 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:33:42,707 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:33:43,977 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:33:45,255 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:33:46,432 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:33:47,662 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:33:48,872 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:33:50,100 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:33:51,319 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:33:52,550 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:33:53,782 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
2024-10-24 01:40:40,090 ERROR [ ome.system.UpgradeCheck] (Response-1) Error reading from url: http://upgrade.openmicroscopy.org.uk?version=test;os.name=Linux;os.arch=amd64;os.version=5.14.0-427.40.1.el9_4.x86_64;java.runtime.version=11.0.24%2B8-LTS;java.vm.vendor=Red+Hat%2C+Inc.
"connect timed out" 2024-10-24 01:42:12,538 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:42:17,716 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 01:42:26,506 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 01:42:48,201 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
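
The repeated ClientKeepAlive errors above all have the same shape: a scheduled keep-alive task keeps calling OMEROMetadataStoreClient.ping() after the import client's Ice communicator has already been destroyed, so every tick fails with Ice.CommunicatorDestroyedException until the executor stops. A minimal, self-contained sketch of that race in plain JDK code (this is not OMERO's implementation; the Connection class and its methods are hypothetical stand-ins):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class KeepAliveRace {

    // Stand-in for the Ice communicator / service factory being pinged.
    static final class Connection {
        private volatile boolean destroyed;

        void ping() {
            if (destroyed) {
                // Analogous to Ice.CommunicatorDestroyedException in the log.
                throw new IllegalStateException("communicator destroyed");
            }
        }

        void destroy() {
            destroyed = true;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Connection connection = new Connection();
        ScheduledExecutorService keepAlive = Executors.newSingleThreadScheduledExecutor();

        // Rough equivalent of ClientKeepAlive: ping the session periodically.
        keepAlive.scheduleAtFixedRate(() -> {
            try {
                connection.ping();
                System.out.println("ping ok");
            } catch (RuntimeException e) {
                // This branch corresponds to the repeated ERROR lines above.
                System.out.println("ping failed: " + e);
            }
        }, 0, 200, TimeUnit.MILLISECONDS);

        Thread.sleep(500);

        // Destroying the connection without stopping the scheduler first
        // leaves the task running against a dead connection.
        connection.destroy();
        Thread.sleep(500);

        // The usual fix: cancel the keep-alive task before (or while)
        // tearing the connection down, so no further pings are attempted.
        keepAlive.shutdownNow();
    }
}

Shutting the scheduler down before, or as part of, tearing down the connection is the usual way to avoid this kind of log noise.
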
2024-10-24 01:42:50,307 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:42:52,083 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:42:53,889 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:42:55,709 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:42:57,477 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:42:59,206 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:43:01,188 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:43:02,888 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:43:04,825 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:43:06,615 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:43:08,618 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:43:09,850 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:49:03,057 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:49:49,910 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 01:49:52,424 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
Could not login with command line arguments.
Glacier2.PermissionDeniedException reason = "Password check failed for '023815ba-12ad-4aa1-976d-5786cc7d92df': [id=3014]"
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
	at java.base/java.lang.Class.newInstance(Class.java:584)
	at IceInternal.BasicStream.createUserException(BasicStream.java:2785)
	at IceInternal.BasicStream.access$300(BasicStream.java:14)
	at IceInternal.BasicStream$EncapsDecoder11.throwException(BasicStream.java:3620)
	at IceInternal.BasicStream.throwException(BasicStream.java:2291)
	at IceInternal.OutgoingAsync.throwUserException(OutgoingAsync.java:399)
	at Glacier2.RouterPrxHelper.end_createSession(RouterPrxHelper.java:178)
	at Glacier2.RouterPrxHelper.createSession(RouterPrxHelper.java:49)
	at Glacier2.RouterPrxHelper.createSession(RouterPrxHelper.java:41)
	at omero.client.createSession(client.java:776)
	at omero.client.createSession(client.java:719)
	at omero.gateway.Gateway.createSession(Gateway.java:1045)
	at omero.gateway.Gateway.connect(Gateway.java:297)
	at integration.gateway.GatewayUsageTest.testLoginFallback(GatewayUsageTest.java:154)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.testng.internal.invokers.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:135)
	at org.testng.internal.invokers.InvokeMethodRunnable.runOne(InvokeMethodRunnable.java:44)
	at org.testng.internal.invokers.InvokeMethodRunnable.call(InvokeMethodRunnable.java:72)
	at org.testng.internal.invokers.InvokeMethodRunnable.call(InvokeMethodRunnable.java:10)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Exception in thread "TestNG-method=testLoginFallback-1" Support for FileAnnotationData is deprecated. Use OriginalFile instead.
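
The Glacier2.PermissionDeniedException above is thrown while Gateway.connect creates a Glacier2 session and the router rejects the supplied password, which the gateway reports as "Could not login with command line arguments." A rough sketch of that call path, assuming the omero-gateway Java API; the host, port, credentials and the exact LoginCredentials constructor are placeholders/assumptions and may differ between gateway versions:

import omero.gateway.Gateway;
import omero.gateway.LoginCredentials;
import omero.gateway.exception.DSOutOfServiceException;
import omero.gateway.model.ExperimenterData;
import omero.log.SimpleLogger;

public class GatewayLoginSketch {
    public static void main(String[] args) {
        // Placeholder credentials and server; a rejected password reproduces
        // the PermissionDeniedException path seen in the log.
        LoginCredentials credentials =
                new LoginCredentials("user", "wrong-password", "omero.example.org", 4064);
        Gateway gateway = new Gateway(new SimpleLogger());
        try {
            ExperimenterData user = gateway.connect(credentials);
            System.out.println("Connected as " + user.getUserName());
        } catch (DSOutOfServiceException e) {
            // Rejected credentials surface here; the cause chain carries
            // Glacier2.PermissionDeniedException with the server's reason string.
            System.err.println("Login failed: " + e.getCause());
        } finally {
            gateway.disconnect();
        }
    }
}
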
Could not load table data
omero.IllegalArgumentException: start value can't be greater than stop value
	at omero.gateway.facility.TablesFacility.query(TablesFacility.java:320)
	at integration.gateway.TablesFacilityTest.testInvalidParams(TablesFacilityTest.java:220)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.testng.internal.invokers.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:135)
	at org.testng.internal.invokers.InvokeMethodRunnable.runOne(InvokeMethodRunnable.java:44)
	at org.testng.internal.invokers.InvokeMethodRunnable.call(InvokeMethodRunnable.java:72)
	at org.testng.internal.invokers.InvokeMethodRunnable.call(InvokeMethodRunnable.java:10)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Exception in thread "TestNG-method=testInvalidParams-1"
Could not load table data
omero.IllegalArgumentException: step value is greater than the specified range
	at omero.gateway.facility.TablesFacility.query(TablesFacility.java:324)
	at integration.gateway.TablesFacilityTest.testInvalidParams(TablesFacilityTest.java:229)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.testng.internal.invokers.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:135)
	at org.testng.internal.invokers.InvokeMethodRunnable.runOne(InvokeMethodRunnable.java:44)
	at org.testng.internal.invokers.InvokeMethodRunnable.call(InvokeMethodRunnable.java:72)
	at org.testng.internal.invokers.InvokeMethodRunnable.call(InvokeMethodRunnable.java:10)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Exception in thread "TestNG-method=testInvalidParams-1"
No concrete type specified for column 'column1', using Object.toString()
Request rows 241 to 319; columns [4, 8, 6, 1]
Request rows 1823 to 1874; columns [9, 0, 5, 8, 2, 7, 10, 1]
Request rows 87 to 159; columns []
Request rows 363 to 454; columns [6, 4, 2, 8, 5]
Request rows 1013 to 1080; columns [2, 3, 10, 0, 5, 9, 7]
2024-10-24 02:04:35,872 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
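
The two IllegalArgumentExceptions from TablesFacility.query are argument checks on the requested row range, exactly as the messages say: the start index must not be greater than the stop index, and the step must fit inside the range. A small stand-alone illustration of those checks (not the OMERO source; the precise comparison used for the step check is an assumption):

public final class RowRangeCheck {

    // Validate a row range [start, stop] read with the given step,
    // mirroring the two error messages seen in the log above.
    static void checkRange(long start, long stop, long step) {
        if (start > stop) {
            throw new IllegalArgumentException("start value can't be greater than stop value");
        }
        if (step > stop - start) {
            throw new IllegalArgumentException("step value is greater than the specified range");
        }
    }

    public static void main(String[] args) {
        checkRange(0, 10, 2);        // valid range, no exception
        try {
            checkRange(10, 5, 1);    // reproduces the first message in the log
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
        try {
            checkRange(0, 10, 50);   // reproduces the second message in the log
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
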
2024-10-24 02:06:01,894 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:06,620 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:10,768 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:15,066 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:18,736 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:22,309 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:25,771 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:29,227 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:32,693 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:34,507 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:36,288 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:38,172 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:40,732 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
2024-10-24 02:06:43,838 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException
	...
7 common frames omitted 2024-10-24 02:06:47,089 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:06:50,050 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:06:52,301 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:06:54,626 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:06:56,979 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:06:59,463 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:07:01,871 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:07:03,286 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:07:04,727 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:07:06,217 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:07:08,501 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:07:12,636 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:07:16,926 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:07:20,967 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:07:25,122 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:07:27,242 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:07:28,880 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:07:32,523 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:07:35,892 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:07:39,458 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:07:42,817 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:07:44,694 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:07:46,426 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:07:48,357 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:08:25,479 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:08:29,950 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:08:34,441 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:08:52,892 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:09:35,326 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:09:46,179 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
	... 7 common frames omitted
2024-10-24 02:09:47,484 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out:
java.lang.RuntimeException: Ice.CommunicatorDestroyedException
	at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845)
	at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: Ice.CommunicatorDestroyedException: null
	at IceInternal.Instance.proxyFactory(Instance.java:239)
	at IceInternal.BasicStream.writeProxy(BasicStream.java:2142)
	at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114)
	at omero.api.ServiceListHelper.write(ServiceListHelper.java:37)
	at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946)
	at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846)
	at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833)
	at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838)
	... 7 common frames omitted
[The identical ClientKeepAlive ping() error and stack trace repeats 45 more times between 02:09:50,037 and 02:10:56,249, at intervals of roughly one to a few seconds, varying only in the timestamp and the keep-alive thread name (0-thread-1 through 9-thread-1).]
7 common frames omitted 2024-10-24 02:11:02,575 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:11:03,493 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:11:06,859 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:11:07,686 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:11:10,985 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:11:11,811 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:11:23,333 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:11:24,180 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:11:30,632 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:11:31,427 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:11:34,848 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:11:35,702 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:11:39,066 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:11:39,916 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:11:46,343 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:11:47,167 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:11:50,624 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:11:51,423 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:11:55,035 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:11:55,892 ERROR [ o.formats.importer.util.ClientKeepAlive] (2-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:12:03,422 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:12:04,253 ERROR [ o.formats.importer.util.ClientKeepAlive] (5-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:12:10,942 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:12:11,861 ERROR [ o.formats.importer.util.ClientKeepAlive] (8-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:12:19,345 ERROR [ o.formats.importer.util.ClientKeepAlive] (0-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:12:20,255 ERROR [ o.formats.importer.util.ClientKeepAlive] (1-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:12:26,717 ERROR [ o.formats.importer.util.ClientKeepAlive] (3-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:12:27,562 ERROR [ o.formats.importer.util.ClientKeepAlive] (4-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
7 common frames omitted 2024-10-24 02:12:30,813 ERROR [ o.formats.importer.util.ClientKeepAlive] (6-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 7 common frames omitted 2024-10-24 02:12:31,658 ERROR [ o.formats.importer.util.ClientKeepAlive] (7-thread-1) Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845) at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: Ice.CommunicatorDestroyedException: null at IceInternal.Instance.proxyFactory(Instance.java:239) at IceInternal.BasicStream.writeProxy(BasicStream.java:2142) at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114) at omero.api.ServiceListHelper.write(ServiceListHelper.java:37) at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846) at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833) at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838) ... 
	... 7 common frames omitted
2024-10-24 02:12:34,931 ERROR [ o.formats.importer.util.ClientKeepAlive] (9-thread-1) Exception while executing ping(), logging Connector out:
java.lang.RuntimeException: Ice.CommunicatorDestroyedException
	at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:845)
	at ome.formats.importer.util.ClientKeepAlive.run(ClientKeepAlive.java:77)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: Ice.CommunicatorDestroyedException: null
	at IceInternal.Instance.proxyFactory(Instance.java:239)
	at IceInternal.BasicStream.writeProxy(BasicStream.java:2142)
	at omero.api.ServiceInterfacePrxHelper.__write(ServiceInterfacePrxHelper.java:114)
	at omero.api.ServiceListHelper.write(ServiceListHelper.java:37)
	at omero.api.ServiceFactoryPrxHelper.begin_keepAllAlive(ServiceFactoryPrxHelper.java:5946)
	at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5846)
	at omero.api.ServiceFactoryPrxHelper.keepAllAlive(ServiceFactoryPrxHelper.java:5833)
	at ome.formats.OMEROMetadataStoreClient.ping(OMEROMetadataStoreClient.java:838)
	... 7 common frames omitted
The same "Exception while executing ping(), logging Connector out: java.lang.RuntimeException: Ice.CommunicatorDestroyedException" error was logged again, with an identical stack trace, at:
2024-10-24 02:12:35,799 (0-thread-1)    2024-10-24 02:12:43,169 (2-thread-1)    2024-10-24 02:12:44,068 (3-thread-1)
2024-10-24 02:12:50,467 (5-thread-1)    2024-10-24 02:12:51,301 (6-thread-1)    2024-10-24 02:12:54,804 (8-thread-1)
2024-10-24 02:12:55,648 (9-thread-1)    2024-10-24 02:12:59,050 (1-thread-1)    2024-10-24 02:12:59,902 (2-thread-1)
2024-10-24 02:13:07,291 (4-thread-1)    2024-10-24 02:13:08,074 (5-thread-1)    2024-10-24 02:13:14,608 (7-thread-1)
2024-10-24 02:13:15,467 (8-thread-1)    2024-10-24 02:13:18,906 (0-thread-1)    2024-10-24 02:13:19,787 (1-thread-1)
2024-10-24 02:13:27,260 (3-thread-1)    2024-10-24 02:13:28,147 (4-thread-1)    2024-10-24 02:13:34,555 (6-thread-1)
2024-10-24 02:13:35,365 (7-thread-1)    2024-10-24 02:13:38,697 (9-thread-1)    2024-10-24 02:13:39,515 (0-thread-1)
2024-10-24 02:13:42,841 (2-thread-1)    2024-10-24 02:13:43,701 (3-thread-1)    2024-10-24 02:13:51,209 (5-thread-1)
2024-10-24 02:13:52,017 (6-thread-1)    2024-10-24 02:13:58,684 (8-thread-1)    2024-10-24 02:13:59,494 (9-thread-1)
2024-10-24 02:14:03,075 (1-thread-1)    2024-10-24 02:14:03,950 (2-thread-1)    2024-10-24 02:14:10,593 (4-thread-1)
2024-10-24 02:14:11,417 (5-thread-1)    2024-10-24 02:14:14,995 (7-thread-1)    2024-10-24 02:14:15,936 (8-thread-1)
2024-10-24 02:14:23,534 (0-thread-1)    2024-10-24 02:14:24,560 (1-thread-1)

===============================================
OmeroJava.integration
Total tests run: 2589, Passes: 2588, Failures: 1, Skips: 0
===============================================

Two further occurrences of the same ClientKeepAlive ping() error followed at 2024-10-24 02:14:31,275 (3-thread-1) and 2024-10-24 02:14:32,174 (4-thread-1), again with identical stack traces.
The tests failed.
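Note: the repeated errors above come from the importer's ClientKeepAlive background thread pinging a session whose Ice communicator has already been destroyed (typically at connection teardown after a test), so the Ice.CommunicatorDestroyedException is wrapped in a RuntimeException and the connector is logged out. The sketch below only illustrates that keep-alive pattern under those assumptions; it is not the OMERO source, and SessionClient and its methods are hypothetical stand-ins for the OMEROMetadataStoreClient.ping()/logout behaviour seen in the trace.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class KeepAliveSketch {

        /** Hypothetical stand-in for the OMERO client; names are illustrative only. */
        interface SessionClient {
            void keepAllAlive() throws Exception; // would go through the Ice proxy, as in the trace above
            void logout();                        // the "logging Connector out" step
        }

        static void startKeepAlive(SessionClient client) {
            ScheduledExecutorService executor = Executors.newScheduledThreadPool(1);
            executor.scheduleWithFixedDelay(() -> {
                try {
                    client.keepAllAlive();
                } catch (Exception e) {
                    // Mirrors the log above: the Ice exception is wrapped in a
                    // RuntimeException, reported, and the connector is logged out.
                    // When the communicator was already destroyed at teardown,
                    // this is shutdown noise rather than a test failure.
                    System.err.println("Exception while executing ping(), logging Connector out: "
                            + new RuntimeException(e));
                    client.logout();
                    executor.shutdown();
                }
            }, 60, 60, TimeUnit.SECONDS);
        }
    }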
BUILD SUCCESSFUL
Total time: 65 minutes 20 seconds
+ /home/omero/workspace/OMERO-test-integration/src/build.py -f components/tools/OmeroPy/build.xml integration -Dtestreports.dir=target/reports/integration
OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
Buildfile: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/build.xml
Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy...
Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy...
python-integration:
Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/target/reports/integration
============================= test session starts ==============================
platform linux -- Python 3.9.18, pytest-8.3.3, pluggy-1.5.0 -- /home/omero/workspace/OMERO-test-integration/.venv3/bin/python3
cachedir: .pytest_cache
django: version: 4.2.16, settings: omeroweb.settings (from ini)
rootdir: /home/omero/workspace/OMERO-test-integration/src/components/tools
configfile: pytest.ini
plugins: xdist-3.6.1, mock-3.14.0, django-4.9.0
collecting ... collected 2061 items / 36 deselected / 2025 selected

test/integration/clitest/test_admin.py::TestAdmin::test_checkupgrade0 FAILED [ 0%]
test/integration/clitest/test_admin.py::TestAdmin::test_checkupgrade1 FAILED [ 0%]
test/integration/clitest/test_admin.py::TestAdmin::test_log PASSED [ 0%]
test/integration/clitest/test_admin.py::TestAdminRestrictedAdmin::test_log PASSED [ 0%]
test/integration/clitest/test_admin.py::TestAdminRestrictedAdmin::test_checkupgrade0 FAILED [ 0%]
test/integration/clitest/test_admin.py::TestAdminRestrictedAdmin::test_checkupgrade1 FAILED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rw-----Image] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rw-----Dataset] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rw-----Project] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rw-----Plate] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rw-----Screen] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwr----Image] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwr----Dataset] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwr----Project] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwr----Plate] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwr----Screen] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwra---Image] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwra---Dataset] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwra---Project] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwra---Plate] PASSED [ 0%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwra---Screen] PASSED [ 1%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwrw---Image] PASSED [ 1%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwrw---Dataset] PASSED [ 1%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwrw---Project] PASSED [ 1%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwrw---Plate] PASSED [ 1%]
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[-rwrw---Screen] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rw-----Image] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rw-----Dataset] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rw-----Project] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rw-----Plate] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rw-----Screen] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwr----Image] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwr----Dataset] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwr----Project] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwr----Plate] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwr----Screen] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwra---Image] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwra---Dataset] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwra---Project] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwra---Plate] PASSED [ 1%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwra---Screen] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwrw---Image] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwrw---Dataset] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwrw---Project] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwrw---Plate] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[Group:-rwrw---Screen] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rw-----Image] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rw-----Dataset] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rw-----Project] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rw-----Plate] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rw-----Screen] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwr----Image] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwr----Dataset] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwr----Project] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwr----Plate] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwr----Screen] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwra---Image] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwra---Dataset] PASSED [ 2%] 
test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwra---Project] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwra---Plate] PASSED [ 2%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwra---Screen] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwrw---Image] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwrw---Dataset] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwrw---Project] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwrw---Plate] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testChgrpMyData[ExperimenterGroup:-rwrw---Screen] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testNonMember PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testGroupName PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testFileset PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testFilesetPartialFailing PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testFilesetOneImage PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testFilesetAllImagesMoveImages PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testFilesetAllImagesMoveDataset PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testNonExistingGroupId[] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testNonExistingGroupId[Group:] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testNonExistingGroupId[ExperimenterGroup:] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testNonExistingGroupName PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsSameClass[True-1] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsSameClass[True-2] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsSameClass[True-3] PASSED [ 3%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsSameClass[False-1] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsSameClass[False-2] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsSameClass[False-3] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesSeparated[True-1] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesSeparated[True-2] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesSeparated[True-3] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesSeparated[False-1] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesSeparated[False-2] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesSeparated[False-3] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesInterlaced[True-1] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesInterlaced[True-2] PASSED [ 4%] 
test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesInterlaced[True-3] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesInterlaced[False-1] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesInterlaced[False-2] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSimpleObjectsTwoClassesInterlaced[False-3] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testBasicSkipheadBothForms[True-1-/Dataset] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testBasicSkipheadBothForms[True-1-] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testBasicSkipheadBothForms[True-2-/Dataset] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testBasicSkipheadBothForms[True-2-] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testBasicSkipheadBothForms[False-1-/Dataset] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testBasicSkipheadBothForms[False-1-] PASSED [ 4%] test/integration/clitest/test_chgrp.py::TestChgrp::testBasicSkipheadBothForms[False-2-/Dataset] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testBasicSkipheadBothForms[False-2-] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsSameClass[True-1] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsSameClass[True-2] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsSameClass[True-3] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsSameClass[False-1] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsSameClass[False-2] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsSameClass[False-3] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsSeparated[True-1] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsSeparated[True-2] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsSeparated[True-3] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsSeparated[False-1] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsSeparated[False-2] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsSeparated[False-3] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsInterlaced[True-1] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsInterlaced[True-2] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsInterlaced[True-3] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsInterlaced[False-1] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsInterlaced[False-2] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrp::testMultipleSkipheadsPlusObjectsInterlaced[False-3] PASSED [ 5%] test/integration/clitest/test_chgrp.py::TestChgrpRoot::testNonMember PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[-Image] PASSED [ 6%] 
test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[-Dataset] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[-Project] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[-Plate] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[-Screen] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[User:-Image] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[User:-Dataset] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[User:-Project] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[User:-Plate] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[User:-Screen] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[Experimenter:-Image] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[Experimenter:-Dataset] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[Experimenter:-Project] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[Experimenter:-Plate] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithId[Experimenter:-Screen] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageTargetUser PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithName[-] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithName[-/] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithName[I-] PASSED [ 6%] test/integration/clitest/test_chown.py::TestChown::testChownBasicUsageWithName[I-/] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testChownDifferentGroup PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testFileset[image-1] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testFileset[image-2] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testFileset[fileset-1] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testFileset[fileset-2] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testFilesetPartialFailing PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testFilesetAllImagesChownDataset PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsSameClass[True-1] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsSameClass[True-2] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsSameClass[True-3] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsSameClass[False-1] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsSameClass[False-2] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsSameClass[False-3] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesSeparated[True-1] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesSeparated[True-2] PASSED [ 7%] 
test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesSeparated[True-3] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesSeparated[False-1] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesSeparated[False-2] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesSeparated[False-3] PASSED [ 7%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesInterlaced[True-1] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesInterlaced[True-2] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesInterlaced[True-3] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesInterlaced[False-1] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesInterlaced[False-2] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSimpleObjectsTwoClassesInterlaced[False-3] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testBasicSkipheadBothForms[True-1-/Dataset] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testBasicSkipheadBothForms[True-1-] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testBasicSkipheadBothForms[True-2-/Dataset] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testBasicSkipheadBothForms[True-2-] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testBasicSkipheadBothForms[False-1-/Dataset] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testBasicSkipheadBothForms[False-1-] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testBasicSkipheadBothForms[False-2-/Dataset] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testBasicSkipheadBothForms[False-2-] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsSeparated[True-1] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsSeparated[True-2] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsSeparated[True-3] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsSeparated[False-1] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsSeparated[False-2] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsSeparated[False-3] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsInterlaced[True-1] PASSED [ 8%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsInterlaced[True-2] PASSED [ 9%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsInterlaced[True-3] PASSED [ 9%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsInterlaced[False-1] PASSED [ 9%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsInterlaced[False-2] PASSED [ 9%] test/integration/clitest/test_chown.py::TestChown::testMultipleSkipheadsPlusObjectsInterlaced[False-3] PASSED [ 9%] test/integration/clitest/test_chown.py::TestChown::testDryRun PASSED [ 9%] test/integration/clitest/test_chown.py::TestChown::testExcludeNone PASSED [ 9%] 
test/integration/clitest/test_chown.py::TestChown::testExcludeDataset PASSED [ 9%] test/integration/clitest/test_chown.py::TestChown::testExcludeImage PASSED [ 9%] test/integration/clitest/test_chown.py::TestChown::testExcludeOverridesInclude PASSED [ 9%] test/integration/clitest/test_chown.py::TestChown::testSeparateAnnotationTransfer PASSED [ 9%] test/integration/clitest/test_chown.py::TestChown::testOutputWithElision PASSED [ 9%] test/integration/clitest/test_chown.py::TestTagChown::testChownOneTagSetWithUniqueTags PASSED [ 9%] test/integration/clitest/test_chown.py::TestChownRoot::testChownBasicUsageWithId PASSED [ 9%] test/integration/clitest/test_chown.py::TestChownNonGroupOwner::testChownBasicUsageWithId PASSED [ 9%] test/integration/clitest/test_cleanse.py::TestCleanse::testCleanseAdminOnly PASSED [ 9%] test/integration/clitest/test_cleanse.py::TestCleanseFullAdmin::testCleanseBasic PASSED [ 9%] test/integration/clitest/test_cleanse.py::TestCleanseFullAdmin::testCleanseNonsenseName PASSED [ 9%] test/integration/clitest/test_cleanse.py::TestCleanseRestrictedAdmin::test_cleanse_restricted_admin PASSED [ 9%] test/integration/clitest/test_cleanse.py::TestFixPyramids::test_fixpyramids_admin_only PASSED [ 9%] test/integration/clitest/test_cleanse.py::TestFixPyramidsRestrictedAdmin::test_fixpyramids_restricted_admin PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testBadVersionDies PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPasswordIsAskedForAgainIfDiffer PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPasswordIsAskedForAgainIfEmpty PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[--] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[----no-salt] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[-0-] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[-0---no-salt] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[-1-] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[-1---no-salt] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[ome--] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[ome----no-salt] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[ome-0-] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[ome-0---no-salt] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[ome-1-] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testPassword[ome-1---no-salt] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testScript[--] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testScript[---f] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testScript[----file] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testScript[---no-salt-] PASSED [ 10%] test/integration/clitest/test_db.py::TestDatabase::testScript[---no-salt--f] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScript[---no-salt---file] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScript[--password ome--] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScript[--password ome---f] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScript[--password ome----file] PASSED [ 11%] 
test/integration/clitest/test_db.py::TestDatabase::testScript[--password ome---no-salt-] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScript[--password ome---no-salt--f] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScript[--password ome---no-salt---file] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[%s %s %s--] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[%s %s %s---f] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[%s %s %s----file] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[%s %s %s---no-salt-] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[%s %s %s---no-salt--f] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[%s %s %s---no-salt---file] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[--version %s --patch %s --password %s--] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[--version %s --patch %s --password %s---f] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[--version %s --patch %s --password %s----file] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[--version %s --patch %s --password %s---no-salt-] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[--version %s --patch %s --password %s---no-salt--f] PASSED [ 11%] test/integration/clitest/test_db.py::TestDatabase::testScriptDeveloperArgs[--version %s --patch %s --password %s---no-salt---file] PASSED [ 11%] test/integration/clitest/test_delete.py::TestDelete::testDeleteMyData[Image-] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testDeleteMyData[Image-I] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testDeleteMyData[Dataset-] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testDeleteMyData[Dataset-I] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testDeleteMyData[Project-] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testDeleteMyData[Project-I] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testDeleteMyData[Plate-] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testDeleteMyData[Plate-I] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testDeleteMyData[Screen-] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testDeleteMyData[Screen-I] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testFileset[image-1] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testFileset[image-2] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testFileset[fileset-1] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testFileset[fileset-2] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testFilesetPartialFailing PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testFilesetAllImagesDeleteDataset PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsSameClass[True-1] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsSameClass[True-2] PASSED [ 12%] 
test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsSameClass[True-3] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsSameClass[False-1] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsSameClass[False-2] PASSED [ 12%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsSameClass[False-3] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesSeparated[True-1] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesSeparated[True-2] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesSeparated[True-3] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesSeparated[False-1] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesSeparated[False-2] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesSeparated[False-3] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesInterlaced[True-1] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesInterlaced[True-2] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesInterlaced[True-3] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesInterlaced[False-1] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesInterlaced[False-2] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSimpleObjectsTwoClassesInterlaced[False-3] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testBasicSkipheadBothForms[True-1-/Dataset] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testBasicSkipheadBothForms[True-1-] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testBasicSkipheadBothForms[True-2-/Dataset] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testBasicSkipheadBothForms[True-2-] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testBasicSkipheadBothForms[False-1-/Dataset] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testBasicSkipheadBothForms[False-1-] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testBasicSkipheadBothForms[False-2-/Dataset] PASSED [ 13%] test/integration/clitest/test_delete.py::TestDelete::testBasicSkipheadBothForms[False-2-] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsSameClass[True-1] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsSameClass[True-2] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsSameClass[True-3] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsSameClass[False-1] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsSameClass[False-2] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsSameClass[False-3] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsSeparated[True-1] PASSED [ 14%] 
test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsSeparated[True-2] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsSeparated[True-3] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsSeparated[False-1] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsSeparated[False-2] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsSeparated[False-3] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsInterlaced[True-1] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsInterlaced[True-2] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsInterlaced[True-3] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsInterlaced[False-1] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsInterlaced[False-2] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testMultipleSkipheadsPlusObjectsInterlaced[False-3] PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testDryRun PASSED [ 14%] test/integration/clitest/test_delete.py::TestDelete::testExcludeNone PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testExcludeDataset PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testExcludeImage PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testExcludeOverridesInclude PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testDefaultExclusion PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testDefaultExclusionOverride PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testDefaultExclusionPartialOverride PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testSeparateAnnotationDelete PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testLinkedAnnotationDelete PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testLinkedAnnotationDeleteWithOverride PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testInputWithElisionDefault[1] PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testInputWithElisionDefault[2] PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testInputWithElisionDefault[3] PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testInputWithElisionForce PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testElisionDefaultFailForce PASSED [ 15%] test/integration/clitest/test_delete.py::TestDelete::testOutputWithElision PASSED [ 15%] test/integration/clitest/test_delete.py::TestTagDelete::testDeleteOneTagSetNotTags PASSED [ 15%] test/integration/clitest/test_delete.py::TestTagDelete::testDeleteTwoTagSetsNotTags PASSED [ 15%] test/integration/clitest/test_delete.py::TestTagDelete::testDeleteOneTagSetIncludingTags PASSED [ 15%] test/integration/clitest/test_delete.py::TestTagDelete::testDeleteTwoTagSetsIncludingTags PASSED [ 15%] test/integration/clitest/test_download.py::TestDownload::testNonExistingOriginalFile[] PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testNonExistingOriginalFile[OriginalFile:] PASSED [ 16%] 
test/integration/clitest/test_download.py::TestDownload::testOriginalFileTmpfile[] PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testOriginalFileTmpfile[OriginalFile:] PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testOriginalFileStdout[] PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testOriginalFileStdout[OriginalFile:] PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testOriginalFileMultipleGroups[] PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testOriginalFileMultipleGroups[OriginalFile:] PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testNonExistingFileAnnotation PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testFileAnnotationTmpfile PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testFileAnnotationStdout PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testFileAnnotationMultipleGroups PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testNonExistingImage PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testImageFileset[True] PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testImageFileset[False] PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testSingleImageWithCompanion PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testMIF PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testImageNoFileset PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testImageMultipleGroups PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testValidPolicy PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testPolicyGlobalRestriction[+read,+write,+image] PASSED [ 16%] test/integration/clitest/test_download.py::TestDownload::testPolicyGlobalRestriction[-read,-write,-image,-plate] SKIPPED [ 17%] test/integration/clitest/test_download.py::TestDownload::testPolicyGlobalRestriction[-read,+write,+image,-plate] SKIPPED [ 17%] test/integration/clitest/test_download.py::TestDownload::testPolicyGlobalRestriction[+read,+write,+image,-plate] SKIPPED [ 17%] test/integration/clitest/test_download.py::TestDownload::testPolicyGlobalRestriction[+read,+write,+image,+plate] SKIPPED [ 17%] test/integration/clitest/test_download.py::TestDownload::testPolicyGroupRestriction[+read,+write,+image] PASSED [ 17%] test/integration/clitest/test_download.py::TestDownload::testPolicyGroupRestriction[-read,-write,-image,-plate] PASSED [ 17%] test/integration/clitest/test_download.py::TestDownload::testPolicyGroupRestriction[-read,+write,+image,-plate] PASSED [ 17%] test/integration/clitest/test_download.py::TestDownload::testPolicyGroupRestriction[+read,+write,+image,-plate] PASSED [ 17%] test/integration/clitest/test_download.py::TestDownload::testPolicyGroupRestriction[+read,+write,+image,+plate] PASSED [ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObject[Image-] PASSED [ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObject[Image-I] PASSED [ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObject[Dataset-] PASSED [ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObject[Dataset-I] PASSED [ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObject[Project-] PASSED 
[ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObject[Project-I] PASSED [ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObject[Plate-] PASSED [ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObject[Plate-I] PASSED [ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObject[Screen-] PASSED [ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObject[Screen-I] PASSED [ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObjectDryRun PASSED [ 17%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObjectReport PASSED [ 18%] test/integration/clitest/test_duplicate.py::TestDuplicate::testDuplicateSingleObjectDryRunReport PASSED [ 18%] test/integration/clitest/test_duplicate.py::TestDuplicate::testBasicHierarchyDuplication PASSED [ 18%] test/integration/clitest/test_duplicate.py::TestDuplicate::testIgnoreLinks PASSED [ 18%] test/integration/clitest/test_duplicate.py::TestDuplicate::testIgnoreLinksOverridden PASSED [ 18%] test/integration/clitest/test_duplicate.py::TestDuplicate::testReferencing PASSED [ 18%] test/integration/clitest/test_duplicate.py::TestDuplicate::testSkipheadDuplication PASSED [ 18%] test/integration/clitest/test_fs.py::TestFS::testRepos PASSED [ 18%] test/integration/clitest/test_fs.py::TestFS::testSetsWithTransfer PASSED [ 18%] test/integration/clitest/test_fs.py::TestFS::testSetsAdminOnly PASSED [ 18%] test/integration/clitest/test_fs.py::TestFS::testMkdirAdminOnly PASSED [ 18%] test/integration/clitest/test_fs.py::TestFS::testImportTime[all-True] PASSED [ 18%] test/integration/clitest/test_fs.py::TestFS::testImportTime[all-False] PASSED [ 18%] test/integration/clitest/test_fs.py::TestFS::testImportTime[minmax-True] PASSED [ 18%] test/integration/clitest/test_fs.py::TestFS::testImportTime[minmax-False] PASSED [ 18%] test/integration/clitest/test_fs.py::TestFS::testImportTime[thumbnails-True] PASSED [ 18%] test/integration/clitest/test_fs.py::TestFS::testImportTime[thumbnails-False] PASSED [ 18%] test/integration/clitest/test_fs.py::TestFsRoot::testMkdirAsAdminSimpleDirCreation PASSED [ 18%] test/integration/clitest/test_fs.py::TestFsRoot::testMkdirAsAdminHierarchyOnlyPreexisting PASSED [ 18%] test/integration/clitest/test_fs.py::TestFsRoot::testMkdirAsAdminHierarchyParents PASSED [ 18%] test/integration/clitest/test_group.py::TestGroup::testList[None-None] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testList[None-id] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testList[None-name] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testList[count-None] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testList[count-id] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testList[count-name] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testList[long-None] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testList[long-id] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testList[long-name] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testAddAdminOnly PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testInfoNoArgument PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testInfoArgument[id] PASSED [ 19%] 
test/integration/clitest/test_group.py::TestGroup::testInfoArgument[name] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testInfoArgument[--group-id] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testInfoArgument[--group-name] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testInfoInvalidGroup PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testListUsersNoArgument PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testListUsersArgument[id] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testListUsersArgument[name] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testListUsersArgument[--group-id] PASSED [ 19%] test/integration/clitest/test_group.py::TestGroup::testListUsersArgument[--group-name] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroup::testListUsersInvalidArgument PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testAddDefaults PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testAddPerms[--perms-rw----] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testAddPerms[--perms-rwr---] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testAddPerms[--perms-rwra--] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testAddPerms[--perms-rwrw--] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testAddPerms[--type-private] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testAddPerms[--type-read-only] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testAddPerms[--type-read-annotate] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testAddPerms[--type-read-write] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testAddSameNamefails PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testAddIgnoreExisting PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rw-----rw-------id] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rw-----rw-------name] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rw-----rwr------id] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rw-----rwr------name] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rw-----rwra-----id] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rw-----rwra-----name] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rw-----rwrw-----id] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rw-----rwrw-----name] PASSED [ 20%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwr----rw-------id] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwr----rw-------name] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwr----rwr------id] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwr----rwr------name] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwr----rwra-----id] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwr----rwra-----name] PASSED [ 21%] 
test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwr----rwrw-----id] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwr----rwrw-----name] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwra---rw-------id] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwra---rw-------name] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwra---rwr------id] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwra---rwr------name] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwra---rwra-----id] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwra---rwra-----name] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwra---rwrw-----id] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwra---rwrw-----name] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwrw---rw-------id] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwrw---rw-------name] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwrw---rwr------id] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwrw---rwr------name] PASSED [ 21%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwrw---rwra-----id] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwrw---rwra-----name] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwrw---rwrw-----id] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--perms-rwrw---rwrw-----name] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-private-rw-------id] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-private-rw-------name] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-private-rwr------id] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-private-rwr------name] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-private-rwra-----id] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-private-rwra-----name] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-private-rwrw-----id] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-private-rwrw-----name] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-only-rw-------id] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-only-rw-------name] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-only-rwr------id] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-only-rwr------name] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-only-rwra-----id] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-only-rwra-----name] PASSED [ 22%] 
test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-only-rwrw-----id] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-only-rwrw-----name] PASSED [ 22%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-annotate-rw-------id] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-annotate-rw-------name] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-annotate-rwr------id] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-annotate-rwr------name] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-annotate-rwra-----id] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-annotate-rwra-----name] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-annotate-rwrw-----id] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-annotate-rwrw-----name] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-write-rw-------id] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-write-rw-------name] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-write-rwr------id] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-write-rwr------name] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-write-rwra-----id] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-write-rwra-----name] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-write-rwrw-----id] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testPerms[--type-read-write-rwrw-----name] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[None-id---id] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[None-id---name] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[None-omeName---id] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[None-omeName---name] PASSED [ 23%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[None---user-id---id] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[None---user-id---name] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[None---user-name---id] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[None---user-name---name] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[--as-owner-id---id] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[--as-owner-id---name] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[--as-owner-omeName---id] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[--as-owner-omeName---name] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[--as-owner---user-id---id] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[--as-owner---user-id---name] PASSED [ 24%] 
test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[--as-owner---user-name---id] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testAddUser[--as-owner---user-name---name] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-True-id---id] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-True-id---name] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-True-omeName---id] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-True-omeName---name] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-True---user-id---id] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-True---user-id---name] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-True---user-name---id] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-True---user-name---name] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-False-id---id] PASSED [ 24%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-False-id---name] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-False-omeName---id] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-False-omeName---name] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-False---user-id---id] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-False---user-id---name] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-False---user-name---id] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[None-False---user-name---name] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-True-id---id] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-True-id---name] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-True-omeName---id] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-True-omeName---name] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-True---user-id---id] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-True---user-id---name] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-True---user-name---id] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-True---user-name---name] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-False-id---id] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-False-id---name] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-False-omeName---id] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-False-omeName---name] PASSED [ 25%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-False---user-id---id] PASSED [ 25%] 
test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-False---user-id---name] PASSED [ 26%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-False---user-name---id] PASSED [ 26%] test/integration/clitest/test_group.py::TestGroupRoot::testRemoveUser[--as-owner-False---user-name---name] PASSED [ 26%] test/integration/clitest/test_group.py::TestGroupRoot::testCopyUsers[None-id-id] PASSED [ 26%] test/integration/clitest/test_group.py::TestGroupRoot::testCopyUsers[None-id-name] PASSED [ 26%] test/integration/clitest/test_group.py::TestGroupRoot::testCopyUsers[None-name-id] PASSED [ 26%] test/integration/clitest/test_group.py::TestGroupRoot::testCopyUsers[None-name-name] PASSED [ 26%] test/integration/clitest/test_group.py::TestGroupRoot::testCopyUsers[--as-owner-id-id] PASSED [ 26%] test/integration/clitest/test_group.py::TestGroupRoot::testCopyUsers[--as-owner-id-name] PASSED [ 26%] test/integration/clitest/test_group.py::TestGroupRoot::testCopyUsers[--as-owner-name-id] PASSED [ 26%] test/integration/clitest/test_group.py::TestGroupRoot::testCopyUsers[--as-owner-name-name] PASSED [ 26%] test/integration/clitest/test_import.py::TestImport::testAutoClose PASSED [ 26%] test/integration/clitest/test_import.py::TestImport::testChecksumAlgorithm[Adler-32-legacy] PASSED [ 26%] test/integration/clitest/test_import.py::TestImport::testChecksumAlgorithm[Adler-32-underscore] PASSED [ 26%] test/integration/clitest/test_import.py::TestImport::testChecksumAlgorithm[CRC-32-underscore] PASSED [ 26%] test/integration/clitest/test_import.py::TestImport::testChecksumAlgorithm[File-Size-64-underscore] PASSED [ 26%] test/integration/clitest/test_import.py::TestImport::testChecksumAlgorithm[MD5-128-underscore] PASSED [ 26%] test/integration/clitest/test_import.py::TestImport::testChecksumAlgorithm[Murmur3-32-underscore] PASSED [ 26%] test/integration/clitest/test_import.py::TestImport::testChecksumAlgorithm[Murmur3-128-underscore] PASSED [ 26%] test/integration/clitest/test_import.py::TestImport::testChecksumAlgorithm[SHA1-160-underscore] PASSED [ 26%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Python-1-Official-True] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Python-1-Official-False] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Python-1-Deprecated-True] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Python-1-Deprecated-False] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Java-1-Official-True] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Java-1-Official-False] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Java-1-Deprecated-True] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Java-1-Deprecated-False] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Java-2-Official-True] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Java-2-Official-False] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Java-2-Deprecated-True] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText[Java-2-Deprecated-False] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText_one_ns[Python-1-Official] PASSED [ 
27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText_one_ns[Python-1-Deprecated] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText_one_ns[Java-1-Official] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText_one_ns[Java-1-Deprecated] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText_one_ns[Java-2-Official] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationText_one_ns[Java-2-Deprecated] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationLink[Python-1-Official] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationLink[Python-1-Deprecated] PASSED [ 27%] test/integration/clitest/test_import.py::TestImport::testAnnotationLink[Java-1-Official] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testAnnotationLink[Java-1-Deprecated] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testAnnotationLink[Java-2-Official] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testAnnotationLink[Java-2-Deprecated] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source0-True] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source0-False] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source1-True] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source1-False] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source2-True] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source2-False] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source3-True] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source3-False] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source4-True] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source4-False] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source5-True] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testTargetArgument[source5-False] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameModelTargetArgument[-True] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameModelTargetArgument[-False] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameModelTargetArgument[+-True] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameModelTargetArgument[+-False] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameModelTargetArgument[--True] PASSED [ 28%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameModelTargetArgument[--False] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameModelTargetArgument[%-True] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameModelTargetArgument[%-False] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameModelTargetArgument[@-True] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameModelTargetArgument[@-False] PASSED [ 29%] 
test/integration/clitest/test_import.py::TestImport::testQualifiedNameTemplateTargetArgument[-True] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameTemplateTargetArgument[-False] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameTemplateTargetArgument[+-True] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameTemplateTargetArgument[+-False] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameTemplateTargetArgument[--True] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameTemplateTargetArgument[--False] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameTemplateTargetArgument[%-True] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameTemplateTargetArgument[%-False] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameTemplateTargetArgument[@-True] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testQualifiedNameTemplateTargetArgument[@-False] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testMultipleNameModelTargets[-True] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testMultipleNameModelTargets[-False] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testMultipleNameModelTargets[+-True] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testMultipleNameModelTargets[+-False] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testMultipleNameModelTargets[--True] PASSED [ 29%] test/integration/clitest/test_import.py::TestImport::testMultipleNameModelTargets[--False] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testUniqueMultipleNameModelTargets[True] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testUniqueMultipleNameModelTargets[False] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testNestedNameTemplateTargetArgument[True] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testNestedNameTemplateTargetArgument[False] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testMultipleNameTemplateTargetArgument[-True] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testMultipleNameTemplateTargetArgument[-False] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testMultipleNameTemplateTargetArgument[+-True] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testMultipleNameTemplateTargetArgument[+-False] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testMultipleNameTemplateTargetArgument[--True] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testMultipleNameTemplateTargetArgument[--False] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testMultipleNameTemplateTargetArgument[@-True] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testMultipleNameTemplateTargetArgument[@-False] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testUniqueMultipleNameTemplateTargetArgument[True] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testUniqueMultipleNameTemplateTargetArgument[False] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testImportLinkableTarget[False-False] PASSED [ 30%] 
test/integration/clitest/test_import.py::TestImport::testImportLinkableTarget[False-True] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testImportLinkableTarget[True-False] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testImportLinkableTarget[True-True] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testBadTargetArgument[Project] PASSED [ 30%] test/integration/clitest/test_import.py::TestImport::testBadTargetArgument[Plate] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testBadTargetArgument[Image] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testBadModelTargetDiscriminator[Dataset] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testBadModelTargetDiscriminator[Screen] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[None-ALL] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[None-TRACE] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[None-DEBUG] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[None-INFO] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[None-WARN] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[None-ERROR] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[---ALL] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[---TRACE] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[---DEBUG] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[---INFO] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[---WARN] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testDebugArgument[---ERROR] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testImportOutputDefault PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testImportOutputYaml PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testImportOutputLegacy PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testImportOutputDefaultWithScreen[1] PASSED [ 31%] test/integration/clitest/test_import.py::TestImport::testImportOutputDefaultWithScreen[2] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testImportOutputDefaultWithScreen[3] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testImportAsRoot PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testImportMultiGroup PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testImportAsRootMultiGroup PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Image] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Image-x] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Image--description] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Image-n] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Image-n-x] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Image-n--description] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Image--name] PASSED [ 32%] 
test/integration/clitest/test_import.py::TestImport::testNamingArguments[Image--name-x] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Image--name--description] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate-x] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate--description] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate--plate_description] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate-n] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate-n-x] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate-n--description] PASSED [ 32%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate-n--plate_description] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate--name] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate--name-x] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate--name--description] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate--name--plate_description] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate--plate_name] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate--plate_name-x] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate--plate_name--description] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testNamingArguments[Plate--plate_name--plate_description] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testNoThumbnails[--no-thumbnails] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testNoThumbnails[--no_thumbnails] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testSkipArguments[] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testSkipArguments[all] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testSkipArguments[checksum] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testSkipArguments[minmax] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testSkipArguments[thumbnails] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testSkipArguments[upgrade] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testSymlinkImport PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testUnknownTarget[Dataset-test.fake--d] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testUnknownTarget[Screen-SPW&plates=1&plateRows=1&plateCols=1&fields=1&plateAcqs=1.fake--r] PASSED [ 33%] test/integration/clitest/test_import.py::TestImport::testEncryption[] PASSED [ 34%] test/integration/clitest/test_import.py::TestImport::testEncryption[--encrypted=false] PASSED [ 34%] test/integration/clitest/test_import.py::TestImport::testEncryption[--encrypted=true] PASSED [ 34%] test/integration/clitest/test_import.py::TestImport::testBulk PASSED [ 34%] test/integration/clitest/test_import.py::TestImport::testParallelUpload PASSED [ 34%] 
test/integration/clitest/test_import.py::TestImport::testParallelFileset PASSED [ 34%] test/integration/clitest/test_import.py::TestImport::testBulkImportLogs PASSED [ 34%] test/integration/clitest/test_import_bulk.py::TestImportBulk::testBulk PASSED [ 34%] test/integration/clitest/test_ldap.py::TestLDAP::testAdminOnly[active] PASSED [ 34%] test/integration/clitest/test_ldap.py::TestLDAP::testAdminOnly[discover] PASSED [ 34%] test/integration/clitest/test_ldap.py::TestLDAP::testAdminOnly[create] PASSED [ 34%] test/integration/clitest/test_ldap.py::TestLDAP::testAdminOnly[getdn] PASSED [ 34%] test/integration/clitest/test_ldap.py::TestLDAP::testAdminOnly[setdn] PASSED [ 34%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_identifiers PASSED [ 34%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_parents[False] PASSED [ 34%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_parents[True] PASSED [ 34%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_roi_count[0] PASSED [ 34%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_roi_count[1] PASSED [ 34%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_original PASSED [ 34%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_bulkanns[False] PASSED [ 34%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_bulkanns[True] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_measures[False] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_measures[True] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_allanns[False] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadata::test_get_allanns[True] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadataControl::test_original PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadataControl::test_bulkanns[False] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadataControl::test_bulkanns[True] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadataControl::test_measures[False] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadataControl::test_measures[True] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadataControl::test_mapanns[False] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadataControl::test_mapanns[True] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadataControl::test_allanns[False] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadataControl::test_allanns[True] PASSED [ 35%] test/integration/clitest/test_metadata.py::TestMetadataControl::test_pixelsize PASSED [ 35%] test/integration/clitest/test_obj.py::TestObj::test_create_from_file PASSED [ 35%] test/integration/clitest/test_obj.py::TestObj::test_create_from_args PASSED [ 35%] test/integration/clitest/test_obj.py::TestObj::test_linkage PASSED [ 35%] test/integration/clitest/test_obj.py::TestObj::test_linkage_via_variables PASSED [ 35%] test/integration/clitest/test_obj.py::TestObj::test_required[input0] PASSED [ 35%] test/integration/clitest/test_obj.py::TestObj::test_required[input1] PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_required[input2] PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_link_annotation PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_new_get_and_update PASSED [ 36%] 
test/integration/clitest/test_obj.py::TestObj::test_fail_leading_numbers_argument PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_argument_with_letters_and_numbers PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_new_and_get_obj PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_get_unit_and_value PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_get_unknown_and_empty_field PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_get_fields PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_get_list_field PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_list_get PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_map_mods PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_nulling PASSED [ 36%] test/integration/clitest/test_obj.py::TestObj::test_newlines PASSED [ 36%] test/integration/clitest/test_pyramids.py::TestRemovePyramids::test_removepyramids_admin_only PASSED [ 36%] test/integration/clitest/test_pyramids.py::TestRemovePyramidsRestrictedAdmin::test_removepyramids_restricted_admin PASSED [ 36%] test/integration/clitest/test_pyramids.py::TestRemovePyramidsFullAdmin::test_remove_pyramids_little_endian PASSED [ 36%] test/integration/clitest/test_pyramids.py::TestRemovePyramidsFullAdmin::test_remove_pyramids_imported_after_future PASSED [ 36%] test/integration/clitest/test_pyramids.py::TestRemovePyramidsFullAdmin::test_remove_pyramids_limit PASSED [ 36%] test/integration/clitest/test_pyramids.py::TestRemovePyramidsFullAdmin::test_remove_pyramids_not_valid_limit PASSED [ 36%] test/integration/clitest/test_pyramids.py::TestRemovePyramidsFullAdmin::test_remove_pyramids_manual SKIPPED [ 37%] test/integration/clitest/test_pyramids.py::TestRemovePyramidsFullAdmin::test_remove__pre_fs_pyramids PASSED [ 37%] test/integration/clitest/test_pyramids.py::TestRemovePyramidsFullAdmin::test_remove_pyramids_big_endian PASSED [ 37%] test/integration/clitest/test_pyramids.py::TestRemovePyramidsFullAdmin::test_remove_pyramids PASSED [ 37%] test/integration/clitest/test_pyramids.py::TestRemovePyramidsFullAdmin::test_remove_pyramids_check_thumbnails PASSED [ 37%] test/integration/clitest/test_script.py::TestScript::testList PASSED [ 37%] test/integration/clitest/test_script.py::TestScript::testDemo PASSED [ 37%] test/integration/clitest/test_script.py::TestScript::testFullSession PASSED [ 37%] test/integration/clitest/test_script.py::TestScript::testReplace PASSED [ 37%] test/integration/clitest/test_script.py::TestScript::testReplaceOfficial PASSED [ 37%] test/integration/clitest/test_search.py::TestSearch::test_search_basic PASSED [ 37%] test/integration/clitest/test_search.py::TestSearch::test_search_wildcard PASSED [ 37%] test/integration/clitest/test_search.py::TestSearch::test_search_name_field PASSED [ 37%] test/integration/clitest/test_search.py::TestSearch::test_search_description_field PASSED [ 37%] test/integration/clitest/test_search.py::TestSearch::test_search_style PASSED [ 37%] test/integration/clitest/test_search.py::TestSearch::test_search_ids_only PASSED [ 37%] test/integration/clitest/test_search.py::TestSearch::test_search_acquisition_date[data0] PASSED [ 37%] test/integration/clitest/test_search.py::TestSearch::test_search_acquisition_date[data1] PASSED [ 37%] test/integration/clitest/test_search.py::TestSearch::test_search_other_dates[data0] PASSED [ 37%] test/integration/clitest/test_search.py::TestSearch::test_search_other_dates[data1] PASSED [ 37%] 
test/integration/clitest/test_search.py::TestSearch::test_search_other_dates[data2] PASSED [ 38%] test/integration/clitest/test_search.py::TestSearch::test_search_other_dates[data3] PASSED [ 38%] test/integration/clitest/test_search.py::TestSearch::test_search_other_dates[data4] PASSED [ 38%] test/integration/clitest/test_search.py::TestSearch::test_search_no_parse PASSED [ 38%] test/integration/clitest/test_search.py::TestSearch::test_search_dataset_acquisition PASSED [ 38%] test/integration/clitest/test_search.py::TestSearch::test_search_index_by_user PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLogin[None-True] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLogin[None-False] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLogin[300-True] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLogin[300-False] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLoginAs[rw----] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLoginAs[rwr---] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLoginAs[rwra--] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLoginAs[rwrw--] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLoginMultiGroup[True-True] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLoginMultiGroup[True-False] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLoginMultiGroup[False-True] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testLoginMultiGroup[False-False] PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testGroup PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testTimeout PASSED [ 38%] test/integration/clitest/test_sessions.py::TestSessions::testFile PASSED [ 39%] test/integration/clitest/test_sessions.py::TestSessions::testKey PASSED [ 39%] test/integration/clitest/test_sessions.py::TestSessions::testWho[user] PASSED [ 39%] test/integration/clitest/test_sessions.py::TestSessions::testWho[root] PASSED [ 39%] test/integration/clitest/test_sessions.py::TestSessions::test_open PASSED [ 39%] test/integration/clitest/test_sessions.py::TestSessions::test_open_with_id PASSED [ 39%] test/integration/clitest/test_sessions.py::TestSessions::test_open_restricted_admin_no_sudo PASSED [ 39%] test/integration/clitest/test_sessions.py::TestSessions::test_open_restricted_admin_sudo PASSED [ 39%] test/integration/clitest/test_sessions.py::TestSessions::test_close PASSED [ 39%] test/integration/clitest/test_tag.py::TestTag::testCreateTag[None-None] PASSED [ 39%] test/integration/clitest/test_tag.py::TestTag::testCreateTag[None---name] PASSED [ 39%] test/integration/clitest/test_tag.py::TestTag::testCreateTag[--description-None] PASSED [ 39%] test/integration/clitest/test_tag.py::TestTag::testCreateTag[--description---name] PASSED [ 39%] test/integration/clitest/test_tag.py::TestTag::testCreateTagset[None-None] PASSED [ 39%] test/integration/clitest/test_tag.py::TestTag::testCreateTagset[None---name] PASSED [ 39%] test/integration/clitest/test_tag.py::TestTag::testCreateTagset[--desc-None] PASSED [ 39%] test/integration/clitest/test_tag.py::TestTag::testCreateTagset[--desc---name] PASSED [ 39%] test/integration/clitest/test_tag.py::TestTag::testLoadTag PASSED [ 39%] test/integration/clitest/test_tag.py::TestTag::testLoadTagset PASSED [ 39%] 
test/integration/clitest/test_tag.py::TestTag::testLink[Image] PASSED [ 39%] test/integration/clitest/test_tag.py::TestTag::testLink[Dataset] PASSED [ 40%] test/integration/clitest/test_tag.py::TestTag::testLink[Project] PASSED [ 40%] test/integration/clitest/test_tag.py::TestTag::testLink[Screen] PASSED [ 40%] test/integration/clitest/test_tag.py::TestTag::testLink[Plate] PASSED [ 40%] test/integration/clitest/test_tag.py::TestTag::testLinkInvalidObject PASSED [ 40%] test/integration/clitest/test_tag.py::TestTag::testLinkInvalidTag PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rw-----admin-admin] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rw-----admin-owner] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rw-----admin-member] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rw-----owner-admin] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rw-----owner-owner] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rw-----owner-member] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rw-----member-admin] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rw-----member-owner] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rw-----member-member] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwr----admin-admin] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwr----admin-owner] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwr----admin-member] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwr----owner-admin] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwr----owner-owner] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwr----owner-member] PASSED [ 40%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwr----member-admin] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwr----member-owner] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwr----member-member] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwra---admin-admin] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwra---admin-owner] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwra---admin-member] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwra---owner-admin] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwra---owner-owner] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwra---owner-member] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwra---member-admin] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwra---member-owner] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwra---member-member] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwrw---admin-admin] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwrw---admin-owner] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwrw---admin-member] PASSED [ 41%] 
test/integration/clitest/test_tag.py::TestPermissions::testLink[rwrw---owner-admin] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwrw---owner-owner] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwrw---owner-member] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwrw---member-admin] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwrw---member-owner] PASSED [ 41%] test/integration/clitest/test_tag.py::TestPermissions::testLink[rwrw---member-member] PASSED [ 42%] test/integration/clitest/test_tag.py::TestTagList::testList[] PASSED [ 42%] test/integration/clitest/test_tag.py::TestTagList::testList[--nopage] PASSED [ 42%] test/integration/clitest/test_tag.py::TestTagList::testListSets[] PASSED [ 42%] test/integration/clitest/test_tag.py::TestTagList::testListSets[--nopage] PASSED [ 42%] test/integration/clitest/test_upload.py::TestUpload::testUploadSingleFile PASSED [ 42%] test/integration/clitest/test_upload.py::TestUpload::testUploadMultipleFiles PASSED [ 42%] test/integration/clitest/test_upload.py::TestUpload::testUploadBadFile PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[None-None] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[None-id] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[None-login] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[None-first-name] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[None-last-name] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[None-email] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[count-None] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[count-id] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[count-login] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[count-first-name] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[count-last-name] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[count-email] PASSED [ 42%] test/integration/clitest/test_user.py::TestUser::testList[long-None] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testList[long-id] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testList[long-login] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testList[long-first-name] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testList[long-last-name] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testList[long-email] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testListWithStyles[None] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testListWithStyles[sql] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testListWithStyles[csv] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testListWithStyles[plain] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testListWithStyles[json] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testInfoNoArgument PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testInfoArgument[id] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testInfoArgument[omeName] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testInfoArgument[--user-id] PASSED [ 43%] 
test/integration/clitest/test_user.py::TestUser::testInfoArgument[--user-name] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testInfoInvalidUser PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testListGroupsNoArgument PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testListGroupsArgument[id] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testListGroupsArgument[omeName] PASSED [ 43%] test/integration/clitest/test_user.py::TestUser::testListGroupsArgument[--user-id] PASSED [ 44%] test/integration/clitest/test_user.py::TestUser::testListGroupsArgument[--user-name] PASSED [ 44%] test/integration/clitest/test_user.py::TestUser::testListGroupsInvalidArgument PASSED [ 44%] test/integration/clitest/test_user.py::TestUser::testEmail[None] PASSED [ 44%] test/integration/clitest/test_user.py::TestUser::testEmail[-1] PASSED [ 44%] test/integration/clitest/test_user.py::TestUser::testEmail[--one] PASSED [ 44%] test/integration/clitest/test_user.py::TestUser::testPassword[True] PASSED [ 44%] test/integration/clitest/test_user.py::TestUser::testPassword[False] PASSED [ 44%] test/integration/clitest/test_user.py::TestUser::testAddAdminOnly PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[None-id---id] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[None-id---name] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[None-name---id] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[None-name---name] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[None---group-id---id] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[None---group-id---name] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[None---group-name---id] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[None---group-name---name] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[--as-owner-id---id] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[--as-owner-id---name] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[--as-owner-name---id] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[--as-owner-name---name] PASSED [ 44%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[--as-owner---group-id---id] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[--as-owner---group-id---name] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[--as-owner---group-name---id] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testJoinGroup[--as-owner---group-name---name] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-True-id---id] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-True-id---name] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-True-name---id] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-True-name---name] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-True---group-id---id] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-True---group-id---name] PASSED [ 45%] 
test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-True---group-name---id] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-True---group-name---name] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-False-id---id] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-False-id---name] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-False-name---id] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-False-name---name] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-False---group-id---id] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-False---group-id---name] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-False---group-name---id] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[None-False---group-name---name] PASSED [ 45%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-True-id---id] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-True-id---name] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-True-name---id] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-True-name---name] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-True---group-id---id] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-True---group-id---name] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-True---group-name---id] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-True---group-name---name] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-False-id---id] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-False-id---name] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-False-name---id] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-False-name---name] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-False---group-id---id] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-False---group-id---name] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-False---group-name---id] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testLeaveGroup[--as-owner-False---group-name---name] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None-None-None-None] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None-None-None--m] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None-None-None---middlename] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None-None--e-None] PASSED [ 46%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None-None--e--m] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None-None--e---middlename] PASSED [ 47%] 
test/integration/clitest/test_user.py::TestUserRoot::testAdd[None-None---email-None] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None-None---email--m] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None-None---email---middlename] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None--i-None-None] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None--i-None--m] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None--i-None---middlename] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None--i--e-None] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None--i--e--m] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None--i--e---middlename] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None--i---email-None] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None--i---email--m] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None--i---email---middlename] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None---institution-None-None] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None---institution-None--m] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None---institution-None---middlename] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None---institution--e-None] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None---institution--e--m] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None---institution--e---middlename] PASSED [ 47%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None---institution---email-None] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None---institution---email--m] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[None---institution---email---middlename] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a-None-None-None] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a-None-None--m] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a-None-None---middlename] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a-None--e-None] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a-None--e--m] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a-None--e---middlename] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a-None---email-None] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a-None---email--m] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a-None---email---middlename] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a--i-None-None] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a--i-None--m] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a--i-None---middlename] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a--i--e-None] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a--i--e--m] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a--i--e---middlename] PASSED 
[ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a--i---email-None] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a--i---email--m] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a--i---email---middlename] PASSED [ 48%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a---institution-None-None] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a---institution-None--m] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a---institution-None---middlename] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a---institution--e-None] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a---institution--e--m] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a---institution--e---middlename] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a---institution---email-None] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a---institution---email--m] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[-a---institution---email---middlename] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin-None-None-None] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin-None-None--m] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin-None-None---middlename] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin-None--e-None] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin-None--e--m] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin-None--e---middlename] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin-None---email-None] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin-None---email--m] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin-None---email---middlename] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin--i-None-None] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin--i-None--m] PASSED [ 49%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin--i-None---middlename] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin--i--e-None] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin--i--e--m] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin--i--e---middlename] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin--i---email-None] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin--i---email--m] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin--i---email---middlename] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin---institution-None-None] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin---institution-None--m] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin---institution-None---middlename] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin---institution--e-None] PASSED [ 50%] 
test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin---institution--e--m] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin---institution--e---middlename] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin---institution---email-None] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin---institution---email--m] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAdd[--admin---institution---email---middlename] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAddGroup[id] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAddGroup[name] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAddGroup[--group-id] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAddGroup[--group-name] PASSED [ 50%] test/integration/clitest/test_user.py::TestUserRoot::testAddPassword[True-None] PASSED [ 51%] test/integration/clitest/test_user.py::TestUserRoot::testAddPassword[True--P] PASSED [ 51%] test/integration/clitest/test_user.py::TestUserRoot::testAddPassword[True---userpassword] PASSED [ 51%] test/integration/clitest/test_user.py::TestUserRoot::testAddPassword[False-None] PASSED [ 51%] test/integration/clitest/test_user.py::TestUserRoot::testAddPassword[False--P] PASSED [ 51%] test/integration/clitest/test_user.py::TestUserRoot::testAddPassword[False---userpassword] PASSED [ 51%] test/integration/clitest/test_user.py::TestUserRoot::testAddNoPassword PASSED [ 51%] test/integration/clitest/test_user.py::TestUserRoot::testPassword[True] PASSED [ 51%] test/integration/clitest/test_user.py::TestUserRoot::testPassword[False] PASSED [ 51%] test/integration/fstest/test_rename.py::TestRename::test_dir PASSED [ 51%] test/integration/fstest/test_rename.py::TestRename::test_rename_permissions[data0] PASSED [ 51%] test/integration/fstest/test_rename.py::TestRename::test_rename_permissions[data1] PASSED [ 51%] test/integration/fstest/test_rename.py::TestRename::test_rename_permissions[data2] PASSED [ 51%] test/integration/fstest/test_rename.py::TestRename::test_rename_permissions[data3] PASSED [ 51%] test/integration/fstest/test_rename.py::TestRename::test_rename_permissions[data4] PASSED [ 51%] test/integration/fstest/test_rename.py::TestRename::test_rename_annotation PASSED [ 51%] test/integration/fstest/test_rename.py::TestRename::test_prep_and_delete PASSED [ 51%] test/integration/gatewaytest/test_annotation.py::testSameOwner PASSED [ 51%] test/integration/gatewaytest/test_annotation.py::testCommentAnnotation PASSED [ 51%] test/integration/gatewaytest/test_annotation.py::testNonDefGroupAnnotation PASSED [ 51%] test/integration/gatewaytest/test_annotation.py::testTimestampAnnotation PASSED [ 52%] test/integration/gatewaytest/test_annotation.py::testBooleanAnnotation PASSED [ 52%] test/integration/gatewaytest/test_annotation.py::testLongAnnotation PASSED [ 52%] test/integration/gatewaytest/test_annotation.py::testMapAnnotation PASSED [ 52%] test/integration/gatewaytest/test_annotation.py::testDualLinkedAnnotation PASSED [ 52%] test/integration/gatewaytest/test_annotation.py::testListAnnotations PASSED [ 52%] test/integration/gatewaytest/test_annotation.py::testFileAnnotation PASSED [ 52%] test/integration/gatewaytest/test_annotation.py::testFileAnnotationNoName PASSED [ 52%] test/integration/gatewaytest/test_annotation.py::testFileAnnotationSpeed PASSED [ 52%] 
test/integration/gatewaytest/test_annotation.py::testFileAnnNonDefaultGroup PASSED [ 52%] test/integration/gatewaytest/test_annotation.py::testUnlinkAnnotation PASSED [ 52%] test/integration/gatewaytest/test_annotation.py::testAnnotationCount PASSED [ 52%] test/integration/gatewaytest/test_chgrp.py::testImageChgrp PASSED [ 52%] test/integration/gatewaytest/test_chgrp.py::testDatasetChgrp PASSED [ 52%] test/integration/gatewaytest/test_chgrp.py::testPDIChgrp PASSED [ 52%] test/integration/gatewaytest/test_chgrp.py::testTwoDatasetsChgrpToProject PASSED [ 52%] test/integration/gatewaytest/test_chgrp.py::testMultiDatasetDoAll PASSED [ 52%] test/integration/gatewaytest/test_chmod.py::TestChmodGroup::testChmod PASSED [ 52%] test/integration/gatewaytest/test_chmod.py::TestCustomUsers::testReadOnly PASSED [ 52%] test/integration/gatewaytest/test_chmod.py::TestCustomUsers::testReadAnnotate PASSED [ 52%] test/integration/gatewaytest/test_chmod.py::TestCustomUsers::testGroupMinusOne PASSED [ 52%] test/integration/gatewaytest/test_chmod.py::TestCustomUsers::testReadWrite PASSED [ 53%] test/integration/gatewaytest/test_chmod.py::TestCustomUsers::testDelete8723 PASSED [ 53%] test/integration/gatewaytest/test_chmod.py::TestManualCreateEdit::testReadOnly PASSED [ 53%] test/integration/gatewaytest/test_chmod.py::Test8800::testWithBlitzWrappers PASSED [ 53%] test/integration/gatewaytest/test_chmod.py::Test8800::testWithoutWrappers PASSED [ 53%] test/integration/gatewaytest/test_chmod.py::TestDefaultSetup::testAuthorCanEdit PASSED [ 53%] test/integration/gatewaytest/test_chown.py::TestChown::test_chown_project PASSED [ 53%] test/integration/gatewaytest/test_chown.py::TestChown::test_chown_pdi PASSED [ 53%] test/integration/gatewaytest/test_config_service.py::testInterpolateSetting[foo] PASSED [ 53%] test/integration/gatewaytest/test_config_service.py::testInterpolateSetting[False] PASSED [ 53%] test/integration/gatewaytest/test_config_service.py::testInterpolateSetting[false] PASSED [ 53%] test/integration/gatewaytest/test_config_service.py::testInterpolateSetting[True] PASSED [ 53%] test/integration/gatewaytest/test_config_service.py::testInterpolateSetting[true] PASSED [ 53%] test/integration/gatewaytest/test_config_service.py::testInterpolateSetting[] PASSED [ 53%] test/integration/gatewaytest/test_config_service.py::testInterpolateSetting[None] PASSED [ 53%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testMultiProcessSession PASSED [ 53%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testClose ERROR [ 53%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testTopLevelObjects ERROR [ 53%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testCloseSession PASSED [ 53%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testMiscellaneous PASSED [ 53%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testConnectUsingClient PASSED [ 54%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testConnectUsingClientNoSessionWithIdentity PASSED [ 54%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testConnectUsingClientSessionWithoutIdentity PASSED [ 54%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testSessionId PASSED [ 54%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testConnectWithSessionId PASSED [ 54%] 
test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testSecureWithSecureClient PASSED [ 54%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testSecureWithUnsecureClient PASSED [ 54%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testSecureWithUsername[None] PASSED [ 54%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testSecureWithUsername[False] PASSED [ 54%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testSecureWithUsername[True] PASSED [ 54%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testSecureMisMatch PASSED [ 54%] test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testHost PASSED [ 54%] test/integration/gatewaytest/test_delete.py::TestDelete::testDeleteObjectsUnwrapped PASSED [ 54%] test/integration/gatewaytest/test_delete.py::TestDelete::testDeleteObjects PASSED [ 54%] test/integration/gatewaytest/test_delete.py::TestDelete::testDeleteObjectsWait PASSED [ 54%] test/integration/gatewaytest/test_delete.py::TestDelete::testDeleteObjectsDryRun PASSED [ 54%] test/integration/gatewaytest/test_delete.py::TestDelete::testDeleteAnnotatedFileAnnotation PASSED [ 54%] test/integration/gatewaytest/test_delete.py::TestDelete::testDeleteObjectDirect PASSED [ 54%] test/integration/gatewaytest/test_fs.py::TestFileset::testCountArchivedFiles PASSED [ 54%] test/integration/gatewaytest/test_fs.py::TestFileset::testCountFilesetFiles PASSED [ 54%] test/integration/gatewaytest/test_fs.py::TestFileset::testCountImportedImageFiles PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestFileset::testGetImportedFilesInfo PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestFileset::testGetImportedFilesInfoWithAnnotations PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestFileset::testGetArchivedFiles PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestFileset::testGetImportedImageFiles PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestFileset::testGetArchivedFilesInfo PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestFileset::testGetFilesetFilesInfo PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestFileset::testGetFilesetFilesInfoWithAnnotations PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestFileset::testGetFilesetFilesInfoMultiple PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestFileset::testGetFilesetFilesInfoMultipleWithAnnotations PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestFileset::testGetFileset PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestFileset::testGetImportedImageFilePaths PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testCountArchivedFiles PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testCountFilesetFiles PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testCountImportedImageFiles PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testGetImportedFilesInfo PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testGetArchivedFiles PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testGetImportedImageFiles PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testGetArchivedFilesInfo PASSED [ 55%] test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testGetFilesetFilesInfo PASSED [ 55%] 
test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testGetFilesetFilesInfoMultiple PASSED [ 56%] test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testGetFileset PASSED [ 56%] test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testGetImportedImageFilePaths PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestDeleteObject::testDeleteAnnotation PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestDeleteObject::testDeleteImage PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestFindObject::testIllegalObjTypeInt PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestFindObject::testObjTypeUnicode PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestFindObject::testObjTypeString PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestFindObject::testFindProject PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestFindObject::testFindExperimenter ERROR [ 56%] test/integration/gatewaytest/test_get_objects.py::TestFindObject::testFindAnnotation PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testSearchObjects PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testListProjects PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testPagination PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetDatasetsByProject PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testListExperimentersAndGroups[True] PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testListExperimentersAndGroups[False] PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testListColleagues PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testFindExperimenterWithGroups PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetExperimentersByGroup[True] PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetExperimentersByGroup[False] PASSED [ 56%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetExperimenter PASSED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetAnnotations ERROR [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetImage ERROR [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetImageLoadPixels[True-True] ERROR [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetImageLoadPixels[True-False] ERROR [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetImageLoadPixels[False-True] ERROR [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetImageLoadPixels[False-False] ERROR [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetProject PASSED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testTraversal ERROR [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testListOrphans[False0-True] PASSED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testListOrphans[False0-False] PASSED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testListOrphans[False1-True] PASSED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testListOrphans[False1-False] PASSED [ 57%] 
test/integration/gatewaytest/test_get_objects.py::TestGetObject::testOrderById PASSED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetObjectsByMapAnnotations[Image] FAILED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetObjectsByMapAnnotations[Dataset] FAILED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetObjectsByMapAnnotations[Project] FAILED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetObjectsByMapAnnotations[Screen] FAILED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetObjectsByMapAnnotations[Plate] FAILED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestLeaderAndMemberOfGroup::testGetGroupsLeaderOfAsLeader PASSED [ 57%] test/integration/gatewaytest/test_get_objects.py::TestLeaderAndMemberOfGroup::testGetGroupsLeaderOfAsMember PASSED [ 58%] test/integration/gatewaytest/test_get_objects.py::TestLeaderAndMemberOfGroup::testGetGroupsMemberOf PASSED [ 58%] test/integration/gatewaytest/test_get_objects.py::TestLeaderAndMemberOfGroup::testGroupSummaryAsOwner PASSED [ 58%] test/integration/gatewaytest/test_get_objects.py::TestLeaderAndMemberOfGroup::testGroupSummaryAsMember PASSED [ 58%] test/integration/gatewaytest/test_get_objects.py::TestLeaderAndMemberOfGroup::testGroupSummaryAsOwnerDeprecated PASSED [ 58%] test/integration/gatewaytest/test_get_objects.py::TestLeaderAndMemberOfGroup::testGroupSummaryAsMemberDeprecated PASSED [ 58%] test/integration/gatewaytest/test_get_objects.py::TestListParents::testSupportedObjects PASSED [ 58%] test/integration/gatewaytest/test_get_objects.py::TestListParents::testListParentsPDI PASSED [ 58%] test/integration/gatewaytest/test_get_objects.py::TestListParents::testListParentsSPW PASSED [ 58%] test/integration/gatewaytest/test_get_objects.py::TestListParents::testExperimenterListParents PASSED [ 58%] test/integration/gatewaytest/test_helpers.py::TestHelperObjects::testColorHolder PASSED [ 58%] test/integration/gatewaytest/test_helpers.py::TestHelperObjects::testOmeroType PASSED [ 58%] test/integration/gatewaytest/test_helpers.py::TestHelperObjects::testSplitHTMLColor PASSED [ 58%] test/integration/gatewaytest/test_image.py::TestImage::testThumbnail ERROR [ 58%] test/integration/gatewaytest/test_image.py::TestImage::testThumbnailSet ERROR [ 58%] test/integration/gatewaytest/test_image.py::TestImage::testRenderingModels ERROR [ 58%] test/integration/gatewaytest/test_image.py::TestImage::testSplitChannel ERROR [ 58%] test/integration/gatewaytest/test_image.py::TestImage::testLinePlots ERROR [ 58%] test/integration/gatewaytest/test_image.py::TestImage::testProjections ERROR [ 58%] test/integration/gatewaytest/test_image.py::TestImage::testProperties ERROR [ 58%] test/integration/gatewaytest/test_image.py::TestImage::testPixelSizeUnits ERROR [ 59%] test/integration/gatewaytest/test_image.py::TestImage::testUnitsGetValue ERROR [ 59%] test/integration/gatewaytest/test_image.py::TestImage::testChannelWavelengthUnits ERROR [ 59%] test/integration/gatewaytest/test_image.py::TestImage::testExposureTimeUnits ERROR [ 59%] test/integration/gatewaytest/test_image.py::TestImage::testShortname ERROR [ 59%] test/integration/gatewaytest/test_image.py::TestImage::testImageDate ERROR [ 59%] test/integration/gatewaytest/test_image.py::TestImage::testSimpleMarshal ERROR [ 59%] test/integration/gatewaytest/test_image.py::TestImage::testExport ERROR [ 59%] 
test/integration/gatewaytest/test_image.py::TestImage::testRenderJpegRegion ERROR [ 59%] test/integration/gatewaytest/test_image.py::TestImage::testRenderJpegRegion_resolution ERROR [ 59%] test/integration/gatewaytest/test_image.py::TestImage::testRenderJpegRegion_invalid_resolution ERROR [ 59%] test/integration/gatewaytest/test_image.py::TestImage::testRenderBirdsEyeView ERROR [ 59%] test/integration/gatewaytest/test_image.py::TestImage::testRenderBirdsEyeView_Size ERROR [ 59%] test/integration/gatewaytest/test_image_wrapper.py::TestImageWrapper::testGetDate PASSED [ 59%] test/integration/gatewaytest/test_image_wrapper.py::TestImageWrapper::testGetDateNoAcquisitionDate PASSED [ 59%] test/integration/gatewaytest/test_image_wrapper.py::TestImageWrapper::testSimpleMarshal PASSED [ 59%] test/integration/gatewaytest/test_image_wrapper.py::TestImageWrapper::testChannelLabel PASSED [ 59%] test/integration/gatewaytest/test_image_wrapper.py::TestImageWrapper::testChannelEmissionWaveLabel PASSED [ 59%] test/integration/gatewaytest/test_image_wrapper.py::TestImageWrapper::testChannelNoLabel PASSED [ 59%] test/integration/gatewaytest/test_missing_pyramid.py::TestPyramid::testThrowException PASSED [ 59%] test/integration/gatewaytest/test_missing_pyramid.py::TestPyramid::testPrepareRenderingEngine PASSED [ 60%] test/integration/gatewaytest/test_missing_pyramid.py::TestPyramid::testGetChannels PASSED [ 60%] test/integration/gatewaytest/test_missing_pyramid.py::TestPyramid::testGetChannelsNoRe PASSED [ 60%] test/integration/gatewaytest/test_missing_pyramid.py::TestPyramid::testGetRdefId PASSED [ 60%] test/integration/gatewaytest/test_permissions.py::TestPrivileges::test_update_admin_privileges PASSED [ 60%] test/integration/gatewaytest/test_permissions.py::TestPrivileges::test_full_admin_privileges PASSED [ 60%] test/integration/gatewaytest/test_pixels.py::TestPixels::testReuseRawPixelsStore ERROR [ 60%] test/integration/gatewaytest/test_pixels.py::TestPixels::testPlaneInfo ERROR [ 60%] test/integration/gatewaytest/test_pixels.py::TestPixels::testPixelsType ERROR [ 60%] test/integration/gatewaytest/test_pixels.py::TestPixels::testGetTile ERROR [ 60%] test/integration/gatewaytest/test_pixels.py::TestPixels::testGetPlane ERROR [ 60%] test/integration/gatewaytest/test_pixels.py::TestPixels::testGetPlanesExceptionOnGetPlane ERROR [ 60%] test/integration/gatewaytest/test_pixels.py::TestPixels::testGetPlanesExceptionOnClose ERROR [ 60%] test/integration/gatewaytest/test_pixels.py::TestPixels::testGetPlanesExceptionOnBoth ERROR [ 60%] test/integration/gatewaytest/test_pixels.py::TestPixels::testGetHistogram ERROR [ 60%] test/integration/gatewaytest/test_plate_wrapper.py::TestPlateWrapper::testGetGridSize PASSED [ 60%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testDefault ERROR [ 60%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testCustomized ERROR [ 60%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testChannelWindows ERROR [ 60%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testFloatDefaultMinMax ERROR [ 60%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testEmissionWave ERROR [ 60%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testBatchCopy ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testGroupBasedPermissions ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testGetRdefs ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testResetDefaults ERROR [ 61%] 
test/integration/gatewaytest/test_rdefs.py::TestRDefs::testQuantizationSettings ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testQuantizationSettingsInvalid ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testQuantizationSettingsBulk ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testGetChannelsNoRE ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testSetActiveChannelsNoRE ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testSetActiveChannelsWithRE ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::test_set_active_channels_set_inactive[True] ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::test_set_active_channels_set_inactive[False] ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testUnregisterService ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testRegisterService ERROR [ 61%] test/integration/gatewaytest/test_rdefs.py::TestRDefs::testCloseRE ERROR [ 61%] test/integration/gatewaytest/test_search_objects.py::TestGetObject::testSearchObjects PASSED [ 61%] test/integration/gatewaytest/test_services.py::TestServices::testDeleteServiceAuthor PASSED [ 61%] test/integration/gatewaytest/test_services.py::TestServices::testDeleteServiceAdmin PASSED [ 61%] test/integration/gatewaytest/test_services.py::TestTables::testTableRead PASSED [ 61%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rw_root_dirTrue_grpTrue_16 PASSED [ 61%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rw_root_dirTrue_grpTrue_96 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rw_root_dirTrue_grpFalse_16 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rw_root_dirTrue_grpFalse_96 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rw_root_dirFalse_grpTrue_16 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rw_root_dirFalse_grpTrue_96 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rw_root_dirFalse_grpFalse_16 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rw_root_dirFalse_grpFalse_96 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_root_dirTrue_grpTrue_16 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_root_dirTrue_grpTrue_96 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_root_dirTrue_grpFalse_16 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_root_dirTrue_grpFalse_96 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_root_dirFalse_grpTrue_16 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_root_dirFalse_grpTrue_96 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_root_dirFalse_grpFalse_16 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_root_dirFalse_grpFalse_96 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_member_dirTrue_grpTrue_16 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_member_dirTrue_grpTrue_96 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_member_dirTrue_grpFalse_16 PASSED [ 62%] 
test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_member_dirTrue_grpFalse_96 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_member_dirFalse_grpTrue_16 PASSED [ 62%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_member_dirFalse_grpTrue_96 PASSED [ 63%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_member_dirFalse_grpFalse_16 PASSED [ 63%] test/integration/gatewaytest/test_ticket10618.py::Test10618::test_rwr_member_dirFalse_grpFalse_96 PASSED [ 63%] test/integration/gatewaytest/test_user.py::TestUser::testUsers PASSED [ 63%] test/integration/gatewaytest/test_user.py::TestUser::testSaveAs FAILED [ 63%] test/integration/gatewaytest/test_user.py::TestUser::testCrossGroupSave PASSED [ 63%] test/integration/gatewaytest/test_user.py::TestUser::testGroupOverObjPermissions PASSED [ 63%] test/integration/gatewaytest/test_wrapper.py::TestWrapper::testAllObjectsWrapped ERROR [ 63%] test/integration/gatewaytest/test_wrapper.py::TestWrapper::testProjectWrapper PASSED [ 63%] test/integration/gatewaytest/test_wrapper.py::TestWrapper::testDatasetWrapper ERROR [ 63%] test/integration/gatewaytest/test_wrapper.py::TestWrapper::testExperimenterWrapper PASSED [ 63%] test/integration/gatewaytest/test_wrapper.py::TestWrapper::testDetailsWrapper ERROR [ 63%] test/integration/gatewaytest/test_wrapper.py::TestWrapper::testOriginalFileWrapperGetFileInChunks PASSED [ 63%] test/integration/gatewaytest/test_wrapper.py::TestWrapper::testOriginalFileWrapperAsFileObj PASSED [ 63%] test/integration/gatewaytest/test_wrapper.py::TestWrapper::testOriginalFileWrapperAsFileObjContextManager PASSED [ 63%] test/integration/gatewaytest/test_wrapper.py::TestWrapper::testOriginalFileWrapperAsFileObjMultiple PASSED [ 63%] test/integration/gatewaytest/test_wrapper.py::TestWrapper::testSetters PASSED [ 63%] test/integration/gatewaytest/test_wrapper.py::TestWrapper::testOther PASSED [ 63%] test/integration/metadata/test_metadata_mapannotations.py::TestMapAnnotationManager::test_add_from_namespace_query PASSED [ 63%] test/integration/metadata/test_metadata_mapannotations.py::TestMapAnnotationManager::test_add_from_namespace_query_duplicate PASSED [ 63%] test/integration/metadata/test_metadata_mapannotations.py::TestMapAnnotationManager::test_update_existing_mapann PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadataConfigLoad::test_get_config_local PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadataConfigLoad::test_get_config_remote PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[None-Screen2Plates] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[None-Plate2Wells] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[None-Dataset2Images] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[None-Dataset2Images1Missing] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[None-Dataset101Images] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[None-Project2Datasets] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[None-GZIP] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[1-Screen2Plates] 
PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[1-Plate2Wells] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[1-Dataset2Images] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[1-Dataset2Images1Missing] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[1-Dataset101Images] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[1-Project2Datasets] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[1-GZIP] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[10-Screen2Plates] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[10-Plate2Wells] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[10-Dataset2Images] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[10-Dataset2Images1Missing] PASSED [ 64%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[10-Dataset101Images] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[10-Project2Datasets] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[10-GZIP] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadataNsAnns[Plate2WellsNs] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadataNsAnns[Plate2WellsNs2] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadataNsAnnsUnavailableHeader PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadataNsAnnsFail PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadataDedup::testPopulateMetadataNsAnnsDedup[None] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadataDedup::testPopulateMetadataNsAnnsDedup[openmicroscopy.org/omero/bulk_annotations] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadataDedup::testPopulateMetadataNsAnnsDedup[openmicroscopy.org/mapr/gene] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadataDedup::testPopulateMetadataNsAnnsDedupDelete[None] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadataDedup::testPopulateMetadataNsAnnsDedupDelete[openmicroscopy.org/omero/bulk_annotations] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadataDedup::testPopulateMetadataNsAnnsDedupDelete[openmicroscopy.org/mapr/gene] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadataConfigFiles::test_delete_attach[True-None] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadataConfigFiles::test_delete_attach[True-openmicroscopy.org/omero/bulk_annotations] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadataConfigFiles::test_delete_attach[True-openmicroscopy.org/omero/bulk_annotations/config] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadataConfigFiles::test_delete_attach[False-None] PASSED [ 65%] 
test/integration/metadata/test_populate.py::TestPopulateMetadataConfigFiles::test_delete_attach[False-openmicroscopy.org/omero/bulk_annotations] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateMetadataConfigFiles::test_delete_attach[False-openmicroscopy.org/omero/bulk_annotations/config] PASSED [ 65%] test/integration/metadata/test_populate.py::TestPopulateRois::testPopulateRoisPlate PASSED [ 65%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_get_format_filename_yaml[yaml] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_get_format_filename_yaml[Yml] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_get_format_filename_json[json] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_get_format_filename_json[JS] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_get_format_originalfileid_yaml[format0] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_get_format_originalfileid_yaml[format1] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_get_format_originalfileid_yaml[format2] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_get_format_originalfileid_json[format0] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_get_format_originalfileid_json[format1] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_get_format_originalfileid_json[format2] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_load[json-True] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_load[json-False] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_load[yaml-True] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_load[yaml-False] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_load_fromstring PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_load_invalidtype PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_dump[json] PASSED [ 66%] test/integration/metadata/test_pydict_text.py::TestPydictTextIo::test_dump[yaml] PASSED [ 66%] test/integration/scriptsharness/test_harness.py::TestScriptsViaOmeroCli::testDefinition PASSED [ 66%] test/integration/scriptsharness/test_harness.py::TestScriptsViaOmeroCli::testSimpleScript PASSED [ 66%] test/integration/scriptstest/test_cli.py::TestCLI::testCLI PASSED [ 67%] test/integration/scriptstest/test_coverage.py::TestCoverage::testGetScriptWithDetails PASSED [ 67%] test/integration/scriptstest/test_coverage.py::TestCoverage::testUploadAndScript PASSED [ 67%] test/integration/scriptstest/test_coverage.py::TestCoverage::testUserCantUploadOfficalScript PASSED [ 67%] test/integration/scriptstest/test_inputs.py::TestInputs::testInputs PASSED [ 67%] test/integration/scriptstest/test_ping.py::TestPing::testPingViaISCript PASSED [ 67%] test/integration/scriptstest/test_ping.py::TestPing::testPingParametersViaISCript PASSED [ 67%] test/integration/scriptstest/test_ping.py::TestPing::testPingStdout PASSED [ 67%] test/integration/scriptstest/test_ping.py::TestPing::testProcessShutdown PASSED [ 67%] test/integration/scriptstest/test_ping.py::TestPing::testProcessShutdownOneway PASSED [ 67%] 
test/integration/scriptstest/test_ping.py::TestPing::testProcessorGetResultsBeforeFinished PASSED [ 67%] test/integration/scriptstest/test_ping.py::TestPing::testProcessorExpires PASSED [ 67%] test/integration/scriptstest/test_ping.py::TestPing::testProcessorGetJob PASSED [ 67%] test/integration/scriptstest/test_ping.py::TestPing::testProcessorStop PASSED [ 67%] test/integration/scriptstest/test_ping.py::TestPing::testProcessorDetach PASSED [ 67%] test/integration/scriptstest/test_rand.py::TestRand::testRand PASSED [ 67%] test/integration/scriptstest/test_repo.py::TestScriptRepo::testScriptRepo PASSED [ 67%] test/integration/scriptstest/test_repo.py::TestScriptRepo::testGetOfficialScripts PASSED [ 67%] test/integration/scriptstest/test_repo.py::TestScriptRepo::testGetUserScripts PASSED [ 67%] test/integration/scriptstest/test_repo.py::TestScriptRepo::testCantUndulyLoadScriptRepoFromUuid PASSED [ 67%] test/integration/scriptstest/test_repo.py::TestScriptRepo::testMultipleScriptPathsNotSupported PASSED [ 68%] test/integration/scriptstest/test_repo.py::TestScriptRepo::testUploadingViaOfficialScriptShowsUpInRepo PASSED [ 68%] test/integration/scriptstest/test_repo.py::TestScriptRepo::testUploadingViaNonOfficialScriptDoesntShowUpInRepo PASSED [ 68%] test/integration/scriptstest/test_roi_handling_utils.py::TestRoiHandlingUtils::test_get_line_data PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_split_image PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_numpy_to_image PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_convert_numpy_array PASSED [ 68%]
test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_numpy_save_as_image[True-tiff] Exception ignored in:
Traceback (most recent call last):
  File "/usr/lib64/python3.9/tempfile.py", line 461, in __del__
    self.close()
  File "/usr/lib64/python3.9/tempfile.py", line 457, in close
    unlink(self.name)
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmpylg79jb8.tiff'
PASSED [ 68%]
test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_numpy_save_as_image[True-foo] Exception ignored in:
Traceback (most recent call last):
  File "/usr/lib64/python3.9/tempfile.py", line 461, in __del__
    self.close()
  File "/usr/lib64/python3.9/tempfile.py", line 457, in close
    unlink(self.name)
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmpw_sr2e4p.foo'
PASSED [ 68%]
test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_numpy_save_as_image[False-tiff] PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_numpy_save_as_image[False-foo] PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_create_file[] PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_create_file[text/x-python] PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_calc_sha1 PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_upload_file PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_download_file PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_get_objects PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_download_plane PASSED [ 68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_upload_plane_by_row PASSED [
68%] test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_upload_plane PASSED [ 68%] test/integration/tablestest/test_backwards_compatibility.py::TestBackwardsCompatibility_4_4_5::testCreateAllColumns_4_4_5 PASSED [ 68%] test/integration/tablestest/test_backwards_compatibility.py::TestBackwardsCompatibility_4_4_5::testAllColumns_4_4_5 PASSED [ 69%] test/integration/tablestest/test_backwards_compatibility.py::TestBackwardsCompatibility_4_4_5::testMetadataException PASSED [ 69%] test/integration/tablestest/test_backwards_compatibility.py::TestBackwardsCompatibility_5_3_4::testCreateAllColumnsAndMetadata_5_3_4 PASSED [ 69%] test/integration/tablestest/test_backwards_compatibility.py::TestBackwardsCompatibility_5_3_4::testAllColumnsAndMetadata_5_3_4 PASSED [ 69%] test/integration/tablestest/test_populate_metadata.py::TestPopulateMetadata::testPopulateMetadataPlate PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::testBlankTable PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::testUpdate PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::testTicket2175 PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::testMask PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::test2855MetadataMethods PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::test2910 PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::testDelete PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::test3714GetWhereListVars PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::test4000TableRead PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::testCallContext PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::testGetHeaders PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::test10049openTableUnreadable PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::test9971checkStringLength PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::testArrayColumn PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::testArrayColumnSize1 PASSED [ 69%] test/integration/tablestest/test_service.py::TestTables::testAllColumnsSameTable PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::test10431uninitialisedTableReadWrite PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::test12606fileSizeCheck PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testCantWriteInternalMetadata[data0] PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testCantWriteInternalMetadata[data1] PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testCanWriteAlmostInternalMetadata[data0] PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testCanWriteAlmostInternalMetadata[data1] PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testCanReadInternalMetadata PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testColumnSpaceNames PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testIncludeRowNumbersSlice PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testIncludeRowNumbersReadCoordinates PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testIncludeRowNumbersRead PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testSliceOrdering PASSED [ 70%] 
test/integration/tablestest/test_service.py::TestTables::testReadCoordinatesOrdering PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testSliceEmptyInput PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testReadCoordinatesInvalidInput PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testReadInvalidInput PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testSliceInvalidInput PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testReadStartEnd PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testReadStartGreaterThanEnd PASSED [ 70%] test/integration/tablestest/test_service.py::TestTables::testAddData PASSED [ 71%] test/integration/tablestest/test_service.py::TestTables::testUpdateFirstRow PASSED [ 71%] test/integration/tablestest/test_service.py::TestTables::testUpdateMultipleRows PASSED [ 71%] test/integration/tablestest/test_service.py::TestTables::testUpdateAllows PASSED [ 71%] test/integration/tablestest/test_service.py::TestTables::testCannotUpdateOutOfRange PASSED [ 71%] test/integration/tablestest/test_tables.py::TestTableIntegrity::testAllColumnsAndMetadata PASSED [ 71%] test/integration/test_admin.py::TestAdmin::testGetGroup PASSED [ 71%] test/integration/test_admin.py::TestAdmin::testSetGroup PASSED [ 71%] test/integration/test_admin.py::TestAdmin::testChangePassword PASSED [ 71%] test/integration/test_admin.py::TestAdmin::testGetEventContext4011 PASSED [ 71%] test/integration/test_admin.py::TestAdmin::testUserRoles4056 PASSED [ 71%] test/integration/test_admin.py::TestAdmin::testSetSecurityPassword PASSED [ 71%] test/integration/test_annotation.py::TestFigureExportScripts::testAddAnnotations PASSED [ 71%] test/integration/test_annotationPermissions.py::TestPrivateGroup::testAddTag PASSED [ 71%] test/integration/test_annotationPermissions.py::TestPrivateGroup::testReadTag PASSED [ 71%] test/integration/test_annotationPermissions.py::TestPrivateGroup::testRemoveTag PASSED [ 71%] test/integration/test_annotationPermissions.py::TestReadOnlyGroup::testAddTag PASSED [ 71%] test/integration/test_annotationPermissions.py::TestReadOnlyGroup::testReadTag PASSED [ 71%] test/integration/test_annotationPermissions.py::TestReadOnlyGroup::testRemoveTag PASSED [ 71%] test/integration/test_annotationPermissions.py::TestReadAnnotateGroup::testAddTag PASSED [ 71%] test/integration/test_annotationPermissions.py::TestReadAnnotateGroup::testReadTag PASSED [ 72%] test/integration/test_annotationPermissions.py::TestReadAnnotateGroup::testRemoveTag PASSED [ 72%] test/integration/test_annotationPermissions.py::TestMovePrivatePermissions::testAddTagMakePrivate[root] PASSED [ 72%] test/integration/test_annotationPermissions.py::TestMovePrivatePermissions::testAddTagMakePrivate[admin] PASSED [ 72%] test/integration/test_annotationPermissions.py::TestMovePrivatePermissions::testAddTagMakePrivate[owner] PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpImportedImage PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpImage PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpPDI PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpRdef7825 PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpOneImageFilesetErr PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpAllImagesFilesetOK PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpAllImagesFilesetTwoCommandsErr PASSED [ 72%] 
test/integration/test_chgrp.py::TestChgrp::testChgrpOneDatasetFilesetErr PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpAllDatasetsFilesetOK PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpOneDatasetFilesetOK PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpImagesTwoFilesetsErr PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpDatasetTwoFilesetsErr PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpDatasetCheckFsGroup PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpFilesetOK PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrp11000 PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrp11109 PASSED [ 72%] test/integration/test_chgrp.py::TestChgrp::testChgrpDatasetWithImage PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testChgrpPDIReverseLinkOrder PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testChgrpTwoDatasetsLinkedToSingleImageDefault PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testChgrpTwoDatasetsLinkedToSingleImageHard PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testChgrpProjectWithDatasetLinkedToImageWithOtherDatasetDefault PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testChgrpProjectWithDatasetLinkedToImageWithOtherDatasetHard PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testChgrpDatasetWithImageLinkedToTwoProjects PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testChgrpProjectLinkedToDatasetAndImageDefault PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testChgrpProjectLinkedToDatasetAndImageHard PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testChgrpProjectLinkedToDatasetDefault PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testChgrpProjectLinkedToDatasetHard PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testChgrpProjectLinkedToTwoDatasetsAndImage PASSED [ 73%] test/integration/test_chgrp.py::TestChgrp::testIntergroupLinks PASSED [ 73%] test/integration/test_chgrp.py::TestChgrpTarget::testChgrpImageToTargetDataset PASSED [ 73%] test/integration/test_chgrp.py::TestChgrpTarget::testChgrpMifImagesToTargetDataset PASSED [ 73%] test/integration/test_chgrp.py::TestChgrpTarget::testChgrpImageToTargetDatasetAndBackNoDS PASSED [ 73%] test/integration/test_chgrp.py::TestChgrpTarget::testChgrpImageToTargetDatasetAndBackDS PASSED [ 73%] test/integration/test_chgrp.py::TestChgrpTarget::testChgrpDatasetToTargetProject[user] PASSED [ 73%] test/integration/test_chgrp.py::TestChgrpTarget::testChgrpDatasetToTargetProject[admin] PASSED [ 73%] test/integration/test_chmod.py::TestChmodEasy::test_chmod_rw_rwr PASSED [ 73%] test/integration/test_chown.py::TestChown::test_chown_project PASSED [ 74%] test/integration/test_chown.py::TestChown::test_chown_pdi PASSED [ 74%] test/integration/test_client_ctors.py::TestClientConstructors::testHostConstructor PASSED [ 74%] test/integration/test_client_ctors.py::TestClientConstructors::testEmptyInitializationDataConstructor PASSED [ 74%] test/integration/test_client_ctors.py::TestClientConstructors::testInitializationDataConstructor PASSED [ 74%] test/integration/test_client_ctors.py::TestClientConstructors::testMainArgsConstructor PASSED [ 74%] test/integration/test_client_ctors.py::TestClientConstructors::testMapConstructor PASSED [ 74%] test/integration/test_client_ctors.py::TestClientConstructors::testMainArgsGetsIcePrefix PASSED [ 74%] 
test/integration/test_client_ctors.py::TestClientConstructors::testMainArgsGetsIceConfig PASSED [ 74%] test/integration/test_client_ctors.py::TestClientConstructors::testTwoDifferentHosts PASSED [ 74%] test/integration/test_client_ctors.py::TestClientConstructors::testPorts PASSED [ 74%] test/integration/test_client_ctors.py::TestClientConstructors::testBlockSize PASSED [ 74%] test/integration/test_client_ctors.py::TestClientConstructors::testPythonCtorRepair PASSED [ 74%] test/integration/test_clientusage.py::TestClientUsage::testClientClosedAutomatically PASSED [ 74%] test/integration/test_clientusage.py::TestClientUsage::testClientClosedManually PASSED [ 74%] test/integration/test_clientusage.py::TestClientUsage::testUseSharedMemory PASSED [ 74%] test/integration/test_clientusage.py::TestClientUsage::testCreateInsecureClientTicket2099 PASSED [ 74%] test/integration/test_clientusage.py::TestClientUsage::testGetStatefulServices PASSED [ 74%] test/integration/test_cmdcallback.py::TestCmdCallback::testTimingFinishesOnLatch PASSED [ 74%] test/integration/test_cmdcallback.py::TestCmdCallback::testTimingFinishesOnBlock PASSED [ 74%] test/integration/test_cmdcallback.py::TestCmdCallback::testTimingFinishesOnLoop PASSED [ 75%] test/integration/test_cmdcallback.py::TestCmdCallback::testDoNothingFinishesOnLatch PASSED [ 75%] test/integration/test_cmdcallback.py::TestCmdCallback::testDoNothingFinishesOnLoop PASSED [ 75%] test/integration/test_cmdcallback.py::TestCmdCallback::testDoAllTimingFinishesOnLoop PASSED [ 75%] test/integration/test_counts.py::TestCounts::testBasicUsage PASSED [ 75%] test/integration/test_delete.py::TestDelete::testBasicUsage PASSED [ 75%] test/integration/test_delete.py::TestDelete::testDeleteMany PASSED [ 75%] test/integration/test_delete.py::TestDelete::testDeleteProjectWithoutContent PASSED [ 75%] test/integration/test_delete.py::TestDelete::testCheckIfDeleted PASSED [ 75%] test/integration/test_delete.py::TestDelete::testCheckIfDeleted2 PASSED [ 75%] test/integration/test_delete.py::TestDelete::testOddMessage PASSED [ 75%] test/integration/test_delete.py::TestDelete::testDeleteComment PASSED [ 75%] test/integration/test_delete.py::TestDelete::test3639 PASSED [ 75%] test/integration/test_delete.py::TestDelete::test5793 PASSED [ 75%] test/integration/test_delete.py::TestDelete::test7314 PASSED [ 75%] test/integration/test_delete.py::TestDelete::testDeleteOneDatasetFilesetErr PASSED [ 75%] test/integration/test_delete.py::TestDelete::testDeleteOneImageFilesetErr PASSED [ 75%] test/integration/test_delete.py::TestDelete::testDeleteDatasetFilesetOK PASSED [ 75%] test/integration/test_delete.py::TestDelete::testDeleteAllDatasetsFilesetOK PASSED [ 75%] test/integration/test_delete.py::TestDelete::testDeleteAllImagesFilesetOK PASSED [ 75%] test/integration/test_delete.py::TestDelete::testDeleteFilesetOK PASSED [ 76%] test/integration/test_delete.py::TestDelete::testDeleteImagesTwoFilesetsErr PASSED [ 76%] test/integration/test_delete.py::TestDelete::testDeleteDatasetTwoFilesetsErr PASSED [ 76%] test/integration/test_delete.py::TestDelete::testDeleteProjectWithOneEmptyDataset PASSED [ 76%] test/integration/test_delete.py::TestDelete::testDeleteProjectWithEmptyDatasetLinkedToAnotherProjectDefault PASSED [ 76%] test/integration/test_delete.py::TestDelete::testDeleteProjectWithEmptyDatasetLinkedToAnotherProjectHard PASSED [ 76%] test/integration/test_delete.py::TestDelete::testDeleteProjectWithDatasetLinkedToAnotherProject PASSED [ 76%] 
test/integration/test_delete.py::TestDelete::testDeleteDatasetLinkedToTwoProjects PASSED [ 76%] test/integration/test_delete.py::TestDelete::testDeleteDatasetWithImageLinkedToAnotherDatasetDefault PASSED [ 76%] test/integration/test_delete.py::TestDelete::testDeleteDatasetWithImageLinkedToAnotherDatasetHard PASSED [ 76%] test/integration/test_delete.py::TestDelete::testStepsDuringDelete PASSED [ 76%] test/integration/test_exporter.py::TestExporter::testBasic PASSED [ 76%] test/integration/test_exporter.py::TestExporter::test6713 PASSED [ 76%] test/integration/test_files.py::TestFiles::testUploadDownload PASSED [ 76%] test/integration/test_files.py::TestFiles::test_download_null_size PASSED [ 76%] test/integration/test_iconfig.py::TestConfig::testValuesRegex[data0] PASSED [ 76%] test/integration/test_iconfig.py::TestConfig::testValuesRegex[data1] PASSED [ 76%] test/integration/test_iconfig.py::TestConfig::testValuesRegex[data2] PASSED [ 76%] test/integration/test_iconfig.py::TestConfig::testValuesRegex[data3] PASSED [ 76%] test/integration/test_iconfig.py::TestConfig::testDefaults PASSED [ 76%] test/integration/test_iconfig.py::TestConfig::testRootDefaults PASSED [ 76%] test/integration/test_iconfig.py::TestConfig::testClientDefaults PASSED [ 77%] test/integration/test_iconfig.py::TestConfig::testClientValues PASSED [ 77%] test/integration/test_icontainer.py::TestIContainer::testFindAnnotations PASSED [ 77%] test/integration/test_icontainer.py::TestIContainer::testFindAndCountAnnotationsForSharedData PASSED [ 77%] test/integration/test_icontainer.py::TestIContainer::testCreateAfterBlitzPort PASSED [ 77%] test/integration/test_icontainer.py::TestSplitFilesets::testFilesetSplitByImage PASSED [ 77%] test/integration/test_icontainer.py::TestSplitFilesets::testFilesetNotSplitByImage PASSED [ 77%] test/integration/test_icontainer.py::TestSplitFilesets::testFilesetSplitByDatasetAndProject PASSED [ 77%] test/integration/test_icontainer.py::TestSplitFilesets::testFilesetNotSplitByDatasets PASSED [ 77%] test/integration/test_icontainer.py::TestSplitFilesets::testGetImagesBySplitFilesetsManyCases PASSED [ 77%] test/integration/test_ildap.py::TestILdap::testLookupLdapExperimentersViaAdmin PASSED [ 77%] test/integration/test_imetadata.py::TestIMetadata::testLoadAnnotations3671 PASSED [ 77%] test/integration/test_imetadata.py::TestIMetadata::testLoadAnnotationsUsedNotOwned3671 PASSED [ 77%] test/integration/test_imetadata.py::TestIMetadata::testCountAnnotationsUsedNotOwned3671 PASSED [ 77%] test/integration/test_imetadata.py::TestIMetadata::testCountSpecifiedAnnotations3671 PASSED [ 77%] test/integration/test_imetadata.py::TestIMetadata::testLoadSpecifiedAnnotations3671 PASSED [ 77%] test/integration/test_iquery.py::TestQuery::testGetPixelsCount PASSED [ 77%] test/integration/test_iquery.py::TestQuery::testQueryTaggedUnique PASSED [ 77%] test/integration/test_iquery.py::TestQuery::testClassType PASSED [ 77%] test/integration/test_isession.py::TestISession::testBasicUsage PASSED [ 77%] test/integration/test_isession.py::TestISession::testManuallyClosingOwnSession PASSED [ 78%] test/integration/test_isession.py::TestISession::testCreateSessionForUser PASSED [ 78%] test/integration/test_isession.py::TestISession::testJoinSession_Helper PASSED [ 78%] test/integration/test_isession.py::TestISession::testJoinSession PASSED [ 78%] test/integration/test_isession.py::TestISession::testUpdateSessions[who0] PASSED [ 78%] test/integration/test_isession.py::TestISession::testUpdateSessions[who1] PASSED [ 78%] 
test/integration/test_isession.py::TestISession::testUpdateSessions[who2] PASSED [ 78%] test/integration/test_isession.py::TestISession::testUpdateSessions[who3] PASSED [ 78%] test/integration/test_isession.py::TestISession::testUpdateSessionsNonAdminDisabling PASSED [ 78%] test/integration/test_isession.py::TestISession::testCreateSessionForGuest PASSED [ 78%] test/integration/test_isession.py::TestISession::test1018CreationDestructionClosing PASSED [ 78%] test/integration/test_isession.py::TestISession::testSimpleDestruction PASSED [ 78%] test/integration/test_isession.py::TestISession::testGetMySessionsTicket1975 PASSED [ 78%] test/integration/test_isession.py::TestISession::testTicket2196SetSecurityContext PASSED [ 78%] test/integration/test_isession.py::TestISession::testManageMySessions PASSED [ 78%] test/integration/test_isession.py::TestISession::testSessionWithIP[127.0.0.1] PASSED [ 78%] test/integration/test_isession.py::TestISession::testSessionWithIP[2400:cb00:2048:1::6814:55] PASSED [ 78%] test/integration/test_isession.py::TestISession::testSessionWithIP[1234:5678:1234:5678:1234:5678:121.212.121.212] PASSED [ 78%] test/integration/test_isession.py::TestISession::testCreateUserSession PASSED [ 78%] test/integration/test_isession.py::TestISession::testCreateUserSessionFromSudo[True] PASSED [ 78%] test/integration/test_isession.py::TestISession::testCreateUserSessionFromSudo[False] PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test_that_permissions_are_default_private PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test_basic_usage PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test_canDoAction[canEdit] PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test_canDoAction[canAnnotate] PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test_canDoAction[canDelete] PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test_canDoAction[canLink] PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test8118 PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test1154 PASSED [ 79%] test/integration/test_ishare.py::TestIShare::testCanAnnotate PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test1157 PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test1179 PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test1201 PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test1201b PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test1207 PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test1227 PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test2327 PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test2733 PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test2733Access PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test3214 PASSED [ 79%] test/integration/test_ishare.py::TestIShare::test5711 PASSED [ 80%] test/integration/test_ishare.py::TestIShare::test5756Raw PASSED [ 80%] test/integration/test_ishare.py::TestIShare::test5756Wrapped PASSED [ 80%] test/integration/test_ishare.py::TestIShare::test5851 PASSED [ 80%] test/integration/test_ishare.py::TestIShare::test2513 PASSED [ 80%] test/integration/test_ishare.py::TestIShare::test_OS_regular_user PASSED [ 80%] test/integration/test_ishare.py::TestIShare::test_OS_non_member PASSED [ 80%] test/integration/test_ishare.py::TestIShare::test_OS_admin_user PASSED [ 80%] test/integration/test_ishare.py::TestIShare::test_bad_share PASSED [ 80%] 
test/integration/test_ishare.py::TestIShare::test8513 PASSED [ 80%] test/integration/test_ishare.py::TestIShare::test8704 PASSED [ 80%] test/integration/test_ishare.py::TestIShare::test13018 PASSED [ 80%] test/integration/test_itimeline.py::TestITimeline::testGeneral PASSED [ 80%] test/integration/test_itimeline.py::TestITimeline::testCollaborativeTimeline PASSED [ 80%] test/integration/test_itimeline.py::TestITimeline::test1173 PASSED [ 80%] test/integration/test_itimeline.py::TestITimeline::test1175 PASSED [ 80%] test/integration/test_itimeline.py::TestITimeline::test3234 PASSED [ 80%] test/integration/test_itypes.py::TestTypes::testGetEnumerationTypes PASSED [ 80%] test/integration/test_itypes.py::TestTypes::testAllEnumerations PASSED [ 80%] test/integration/test_itypes.py::TestTypes::testGetEnumerationWithEntries PASSED [ 80%] test/integration/test_itypes.py::TestTypes::testManageEnumeration PASSED [ 80%] test/integration/test_iupdate.py::TestIUpdate::testSaveArray PASSED [ 81%] test/integration/test_iupdate.py::TestIUpdate::testSaveCollection PASSED [ 81%] test/integration/test_iupdate.py::TestIUpdate::testExternalInfoOnCreation PASSED [ 81%] test/integration/test_iupdate.py::TestIUpdate::testExternalInfoAfterCreationTransient PASSED [ 81%] test/integration/test_iupdate.py::TestIUpdate::testExternalInfoAfterCreationManaged PASSED [ 81%] test/integration/test_iupdate.py::TestIUpdate::testExternalInfoNewInstance PASSED [ 81%] test/integration/test_iupdate.py::TestIUpdate::testExternalInfoNullInstance PASSED [ 81%] test/integration/test_iupdate.py::TestIUpdate::testExternalInfoUpdateInstance PASSED [ 81%] test/integration/test_iupdate.py::TestIUpdate::testCannotSaveDeleted PASSED [ 81%] test/integration/test_librarytest.py::TestLibrary::test9188 PASSED [ 81%] test/integration/test_mail.py::TestMail::testEveryone PASSED [ 81%] test/integration/test_mail.py::TestMail::testUserAdd PASSED [ 81%] test/integration/test_mail.py::TestMail::testComment PASSED [ 81%] test/integration/test_mapannotation.py::TestMapAnnotation::testMapStringField PASSED [ 81%] test/integration/test_mapannotation.py::TestMapAnnotation::testGroupConfigA[data0] PASSED [ 81%] test/integration/test_mapannotation.py::TestMapAnnotation::testGroupConfigA[data1] PASSED [ 81%] test/integration/test_mapannotation.py::TestMapAnnotation::testGroupConfigEdit PASSED [ 81%] test/integration/test_mapannotation.py::TestMapAnnotation::testEmptyItem PASSED [ 81%] test/integration/test_mapannotation.py::TestMapAnnotation::testBigKeys PASSED [ 81%] test/integration/test_metadatastore.py::TestMetadataStore::testBasicUsage PASSED [ 81%] test/integration/test_metadatastore.py::TestMetadataStore::testMetadataService PASSED [ 82%] test/integration/test_model51.py::TestModel51::testExposureTime PASSED [ 82%] test/integration/test_model51.py::TestModel51::testPhysicalSize PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[YOTTAMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[ZETTAMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[EXAMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[PETAMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[TERAMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[GIGAMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[MEGAMETER] PASSED [ 82%] 
test/integration/test_model51.py::TestModel51::testAllLengths[KILOMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[HECTOMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[DECAMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[METER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[DECIMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[CENTIMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[MILLIMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[MICROMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[NANOMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[PICOMETER] PASSED [ 82%] test/integration/test_model51.py::TestModel51::testAllLengths[FEMTOMETER] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[ATTOMETER] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[ZEPTOMETER] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[YOCTOMETER] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[ANGSTROM] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[ASTRONOMICALUNIT] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[LIGHTYEAR] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[PARSEC] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[THOU] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[LINE] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[INCH] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[FOOT] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[YARD] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[MILE] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[POINT] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[PIXEL] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAllLengths[REFERENCEFRAME] PASSED [ 83%] test/integration/test_model51.py::TestModel51::testAsMapMethod PASSED [ 83%] test/integration/test_model51.py::TestModel51::testMapEagerFetch PASSED [ 83%] test/integration/test_model51.py::TestModel51::testMapSecurity PASSED [ 83%] test/integration/test_model51.py::TestModel51::testUnitProjections PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testLoginToPublicGroupTicket1940 PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testLinkingInPrivateGroup PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testCreatAndUpdatePrivateGroup PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testCreatAndUpdatePublicGroupReadOnly PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testCreatAndUpdatePublicGroupReadAnnotate PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testCreatAndUpdatePublicGroup PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testCreatGroupAndchangePermissions PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testGroupOwners PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testSearchAllGroups PASSED [ 84%] 
test/integration/test_permissions.py::TestPermissions::testOGContextParameter PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testOGSetImplicitContext PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testOGSetProxyContext PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testOGSetSecurityContext PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testOGArg PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testSaveWithNegOneNotExplicit PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testPrivateGroupCallContext PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testAdminCanQueryWithGroupMinusOneTicket9632 PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testOmeroUserAsAdmin PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testOmeroUserAsNonAdmin PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testImmutablePermissions PASSED [ 84%] test/integration/test_permissions.py::TestPermissions::testDisallow PASSED [ 85%] test/integration/test_permissions.py::TestPermissions::testClientSet PASSED [ 85%] test/integration/test_permissions.py::TestPermissions::testAdminUseOfRawPixelsBean PASSED [ 85%] test/integration/test_permissions.py::TestPermissions::testUseOfRawFileBeanScriptReadGroupMinusOne PASSED [ 85%] test/integration/test_permissions.py::TestPermissions::testUseOfRawFileBeanScriptReadCorrectGroup PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rw-----admin-admin] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rw-----admin-owner] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rw-----admin-member] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rw-----owner-admin] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rw-----owner-owner] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rw-----owner-member] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rw-----member-admin] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rw-----member-owner] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rw-----member-member] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwr----admin-admin] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwr----admin-owner] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwr----admin-member] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwr----owner-admin] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwr----owner-owner] PASSED [ 85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwr----owner-member] PASSED [ 
85%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwr----member-admin] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwr----member-owner] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwr----member-member] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwra---admin-admin] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwra---admin-owner] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwra---admin-member] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwra---owner-admin] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwra---owner-owner] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwra---owner-member] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwra---member-admin] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwra---member-owner] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwra---member-member] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwrw---admin-admin] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwrw---admin-owner] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwrw---admin-member] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwrw---owner-admin] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwrw---owner-owner] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwrw---owner-member] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwrw---member-admin] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwrw---member-owner] PASSED [ 86%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissionsWorkaround[rwrw---member-member] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rw-----admin-admin] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rw-----admin-owner] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rw-----admin-member] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rw-----owner-admin] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rw-----owner-owner] PASSED [ 87%] 
test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rw-----owner-member] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rw-----member-admin] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rw-----member-owner] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rw-----member-member] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwr----admin-admin] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwr----admin-owner] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwr----admin-member] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwr----owner-admin] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwr----owner-owner] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwr----owner-member] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwr----member-admin] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwr----member-owner] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwr----member-member] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwra---admin-admin] PASSED [ 87%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwra---admin-owner] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwra---admin-member] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwra---owner-admin] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwra---owner-owner] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwra---owner-member] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwra---member-admin] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwra---member-owner] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwra---member-member] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwrw---admin-admin] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwrw---admin-owner] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwrw---admin-member] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwrw---owner-admin] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwrw---owner-owner] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwrw---owner-member] PASSED [ 88%] 
test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwrw---member-admin] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwrw---member-owner] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testProjectionPermissions[rwrw---member-member] PASSED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testExtendedRestrictions[PlateI] SKIPPED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testExtendedRestrictions[WellI] SKIPPED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testExtendedRestrictions[ImageI] SKIPPED [ 88%] test/integration/test_permissions.py::TestPermissionProjections::testExtendedRestrictions[FileAnnotationI] SKIPPED [ 88%] test/integration/test_pixelsService.py::TestPixelsService::test9655 PASSED [ 89%] test/integration/test_rawfilestore.py::TestRFS::testNullSize11743 PASSED [ 89%] test/integration/test_rawfilestore.py::TestRFS::testGetFileId PASSED [ 89%] test/integration/test_rawpixelsstore.py::TestRPS::testTicket4737WithClose PASSED [ 89%] test/integration/test_rawpixelsstore.py::TestRPS::testTicket4737WithSave PASSED [ 89%] test/integration/test_rawpixelsstore.py::TestRPS::testTicket4737WithForEachTile PASSED [ 89%] test/integration/test_rawpixelsstore.py::TestRPS::testBigPlane PASSED [ 89%] test/integration/test_rawpixelsstore.py::TestRPS::testRomioToPyramid PASSED [ 89%] test/integration/test_rawpixelsstore.py::TestRPS::test2RomioToPyramidWithNegOne PASSED [ 89%] test/integration/test_rawpixelsstore.py::TestRPS::testPyramidConcurrentAccess PASSED [ 89%] test/integration/test_rawpixelsstore.py::TestTiles::testTiles PASSED [ 89%] test/integration/test_reimport.py::TestReimportArchivedFiles::testConvertSynthetic PASSED [ 89%] test/integration/test_render.py::TestRendering::test_render_region PASSED [ 89%] test/integration/test_reporawfilestore.py::TestRepoRawFileStore::test_dir PASSED [ 89%] test/integration/test_reporawfilestore.py::TestRepoRawFileStore::testCreate PASSED [ 89%] test/integration/test_reporawfilestore.py::TestRepoRawFileStore::testWrite PASSED [ 89%] test/integration/test_reporawfilestore.py::TestRepoRawFileStore::testFailedWrite PASSED [ 89%] test/integration/test_reporawfilestore.py::TestRepoRawFileStore::testWriteRead PASSED [ 89%] test/integration/test_reporawfilestore.py::TestRepoRawFileStore::testAppend PASSED [ 89%] test/integration/test_reporawfilestore.py::TestRepoRawFileStore::testTruncateToZero PASSED [ 89%] test/integration/test_reporawfilestore.py::TestRepoRawFileStore::testClose PASSED [ 90%] test/integration/test_reporawfilestore.py::TestRepoRawFileStore::testImportLogFilenameSetting PASSED [ 90%] test/integration/test_repository.py::TestRepository::test_dir PASSED [ 90%] test/integration/test_repository.py::TestRepository::testBasicUsage PASSED [ 90%] test/integration/test_repository.py::TestRepository::testSanityCheckRepos PASSED [ 90%] test/integration/test_repository.py::TestRepository::testManagedRepoAsPubliRepo PASSED [ 90%] test/integration/test_repository.py::TestFileExists::test_dir PASSED [ 90%] test/integration/test_repository.py::TestFileExists::testFileExistsForDirectory PASSED [ 90%] test/integration/test_repository.py::TestFileExists::testFileExistsForFile PASSED [ 90%] test/integration/test_repository.py::TestManagedRepositoryMultiUser::test_dir PASSED [ 90%] 
test/integration/test_repository.py::TestManagedRepositoryMultiUser::testTopPrivateGroup PASSED [ 90%] test/integration/test_repository.py::TestManagedRepositoryMultiUser::testDirPrivateGroup PASSED [ 90%] test/integration/test_repository.py::TestManagedRepositoryMultiUser::testDirReadOnlyGroup PASSED [ 90%] test/integration/test_repository.py::TestManagedRepositoryMultiUser::testDirReadWriteGroup PASSED [ 90%] test/integration/test_repository.py::TestManagedRepositoryMultiUser::testDirReadAnnotateGroup PASSED [ 90%] test/integration/test_repository.py::TestManagedRepositoryMultiUser::testMultiGroup PASSED [ 90%] test/integration/test_repository.py::TestPythonImporter::test_dir PASSED [ 90%] test/integration/test_repository.py::TestPythonImporter::testImportFileset PASSED [ 90%] test/integration/test_repository.py::TestPythonImporter::testImportPaths PASSED [ 90%] test/integration/test_repository.py::TestPythonImporter::testReopenRawFileStoresPR2542 PASSED [ 90%] test/integration/test_repository.py::TestPythonImporter::testImportsFrom2Groups PASSED [ 91%] test/integration/test_repository.py::TestRawAccess::test_dir PASSED [ 91%] test/integration/test_repository.py::TestRawAccess::testAsNonAdmin PASSED [ 91%] test/integration/test_repository.py::TestRawAccess::testAsAdmin PASSED [ 91%] test/integration/test_repository.py::TestDbSync::test_dir PASSED [ 91%] test/integration/test_repository.py::TestDbSync::testMtime PASSED [ 91%] test/integration/test_repository.py::TestDbSync::testFileExists PASSED [ 91%] test/integration/test_repository.py::TestDbSync::testNonDbFileNotReturned PASSED [ 91%] test/integration/test_repository.py::TestRecursiveDelete::test_dir PASSED [ 91%] test/integration/test_repository.py::TestRecursiveDelete::testCmdDeleteCantDeleteDirectories PASSED [ 91%] test/integration/test_repository.py::TestRecursiveDelete::testRecursiveDeleteMethodAvailable PASSED [ 91%] test/integration/test_repository.py::TestRecursiveDelete::testDoubleDot PASSED [ 91%] test/integration/test_repository.py::TestDeleteLog::test_dir PASSED [ 91%] test/integration/test_repository.py::TestDeleteLog::testSimpleDelete PASSED [ 91%] test/integration/test_repository.py::TestUserTemplate::test_dir PASSED [ 91%] test/integration/test_repository.py::TestUserTemplate::testCreateUuidFails PASSED [ 91%] test/integration/test_repository.py::TestUserTemplate::testCreateUserDirPasses PASSED [ 91%] test/integration/test_repository.py::TestUserTemplate::testCreateUuidUnderUserDirPasses PASSED [ 91%] test/integration/test_repository.py::TestUserTemplate::testUserDirShouldBeGloballyWriteable PASSED [ 91%] test/integration/test_repository.py::TestFilesetQueries::test_dir PASSED [ 91%] test/integration/test_repository.py::TestFilesetQueries::testDeleteQuery PASSED [ 92%] test/integration/test_repository.py::TestFilesetQueries::testCountFilesetFiles PASSED [ 92%] test/integration/test_repository.py::TestFilesetQueries::testImportedImageFiles PASSED [ 92%] test/integration/test_repository.py::TestOriginalMetadata::test_dir PASSED [ 92%] test/integration/test_repository.py::TestOriginalMetadata::testFakeImport PASSED [ 92%] test/integration/test_repository.py::TestDeletePerformance::test_dir PASSED [ 92%] test/integration/test_repository.py::TestDeletePerformance::testImport PASSED [ 92%] test/integration/test_rois.py::TestRois::teststats1 PASSED [ 92%] test/integration/test_rois.py::TestRois::test3703 PASSED [ 92%] test/integration/test_rois.py::TestRois::testGetROICount PASSED [ 92%] 
test/integration/test_rois.py::TestRois::test8990 PASSED [ 92%] test/integration/test_rois.py::TestRois::testShapeColors[color0] PASSED [ 92%] test/integration/test_rois.py::TestRois::testShapeColors[color1] PASSED [ 92%] test/integration/test_rois.py::TestRois::testShapeColors[color2] PASSED [ 92%] test/integration/test_rois.py::TestRois::testShapeColors[color3] PASSED [ 92%] test/integration/test_rois.py::TestRois::testShapeColors[color4] PASSED [ 92%] test/integration/test_rois.py::TestRois::testShapeColors[color5] PASSED [ 92%] test/integration/test_rois.py::TestRois::testShapeColors[color6] PASSED [ 92%] test/integration/test_rois.py::TestRois::testShapeColors[color7] PASSED [ 92%] test/integration/test_rois.py::TestRois::testShapeColors[color8] PASSED [ 92%] test/integration/test_rois.py::TestRois::testShapeColors[color9] PASSED [ 92%] test/integration/test_rois.py::TestRois::testGetShapeStatsRestricted PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testBasicUsage PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testTicket1036 PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testUploadAndPing PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testUpload2562 PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testDelete6905 PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testDelete11371 PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testUploadOfficialScript PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testRunScript PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testEditScript PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testScriptValidation PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testSpeedOfThumbnailFigure PASSED [ 93%] test/integration/test_scripts.py::TestScripts::test6066 PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testDynamicTime PASSED [ 93%] test/integration/test_scripts.py::TestScripts::testDynamicUpdate PASSED [ 93%] test/integration/test_search.py::TestSearch::test2541 PASSED [ 93%] test/integration/test_search.py::TestSearch::test3164Private PASSED [ 93%] test/integration/test_search.py::TestSearch::test3164ReadOnlySelf PASSED [ 93%] test/integration/test_search.py::TestSearch::test3164ReadOnlyOther PASSED [ 93%] test/integration/test_search.py::TestSearch::test3164CollabSelf PASSED [ 93%] test/integration/test_search.py::TestSearch::test3164CollabOther PASSED [ 94%] test/integration/test_search.py::TestSearch::test3721Ordering PASSED [ 94%] test/integration/test_search.py::TestSearch::test8692 PASSED [ 94%] test/integration/test_search.py::TestSearch::test8846 PASSED [ 94%] test/integration/test_search.py::TestSearch::testClientPath PASSED [ 94%] test/integration/test_search.py::TestSearch::testFilename PASSED [ 94%] test/integration/test_search.py::TestSearch::test_csv_attachment PASSED [ 94%] test/integration/test_search.py::TestSearch::test_txt_attachment PASSED [ 94%] test/integration/test_search.py::TestSearch::test_word_portions PASSED [ 94%] test/integration/test_search.py::TestSearch::test_empty_query_string PASSED [ 94%] test/integration/test_search.py::TestSearch::test_hyphen_underscore[very-small-very small] PASSED [ 94%] test/integration/test_search.py::TestSearch::test_hyphen_underscore[very-small-very-small] PASSED [ 94%] test/integration/test_search.py::TestSearch::test_hyphen_underscore[very-small-very_small] PASSED [ 94%] 
test/integration/test_search.py::TestSearch::test_hyphen_underscore[very-small-small very] PASSED [ 94%] test/integration/test_search.py::TestSearch::test_hyphen_underscore[very_small-very small] PASSED [ 94%] test/integration/test_search.py::TestSearch::test_hyphen_underscore[very_small-very-small] PASSED [ 94%] test/integration/test_search.py::TestSearch::test_hyphen_underscore[very_small-very_small] PASSED [ 94%] test/integration/test_search.py::TestSearch::test_hyphen_underscore[very_small-small very] PASSED [ 94%] test/integration/test_search.py::TestSearch::test_hyphen_underscore[very small-very small] PASSED [ 94%] test/integration/test_search.py::TestSearch::test_hyphen_underscore[very small-very-small] PASSED [ 94%] test/integration/test_search.py::TestSearch::test_hyphen_underscore[very small-very_small] PASSED [ 95%] test/integration/test_search.py::TestSearch::test_hyphen_underscore[very small-small very] PASSED [ 95%] test/integration/test_search.py::TestSearch::test_map_annotations PASSED [ 95%] test/integration/test_simple.py::TestSimple::testUserId PASSED [ 95%] test/integration/test_simple.py::TestSimple::testGroupId PASSED [ 95%] test/integration/test_simple.py::TestSimple::testGroupPermissions PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testThumbs PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test9070 PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testPrivate10618RootWithGrpCtx PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testPrivate10618RootWithGrpCtxButNoLoad PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testReadOnly10618RootWithGrpCtx PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testReadOnly10618RootWithGrpCtxButNoLoad PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testReadOnly10618MemberWithGrpCtxButNoLoad PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsThumbs PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsThumbsReadOnly PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readOnly-saveCurrent] PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readOnly-saveAs] PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readOnly-request] PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readOnly-resetDefault] PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readOnly-resetDefaultNoSave] PASSED [ 95%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readAnnotate-saveCurrent] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readAnnotate-saveAs] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readAnnotate-request] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readAnnotate-resetDefault] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readAnnotate-resetDefaultNoSave] PASSED [ 96%] 
test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readWrite-saveCurrent] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readWrite-saveAs] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readWrite-request] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readWrite-resetDefault] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[owner-readWrite-resetDefaultNoSave] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readOnly-saveCurrent] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readOnly-saveAs] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readOnly-request] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readOnly-resetDefault] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readOnly-resetDefaultNoSave] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readAnnotate-saveCurrent] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readAnnotate-saveAs] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readAnnotate-request] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readAnnotate-resetDefault] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readAnnotate-resetDefaultNoSave] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readWrite-saveCurrent] PASSED [ 96%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readWrite-saveAs] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readWrite-request] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readWrite-resetDefault] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsRnd[admin-readWrite-resetDefaultNoSave] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsGetThumbnail[readOnly-owner] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsGetThumbnail[readOnly-admin] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsGetThumbnail[readAnnotate-owner] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsGetThumbnail[readAnnotate-admin] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsGetThumbnail[readWrite-owner] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsGetThumbnail[readWrite-admin] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsSetRnd[readOnly-owner] PASSED [ 97%] 
test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsSetRnd[readOnly-admin] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsSetRnd[readAnnotate-owner] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsSetRnd[readAnnotate-admin] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsSetRnd[readWrite-owner] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::test12145ShareSettingsSetRnd[readWrite-admin] PASSED [ 97%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testGetThumbnailSetAfterApplySettings PASSED [ 97%] test/integration/test_thumbs.py::TestThumbs::testCreateThumbnails PASSED [ 97%] test/integration/test_thumbs.py::TestThumbs::testCreateThumbnails64x64 PASSED [ 97%] test/integration/test_thumbs.py::TestThumbs::testCreateThumbnailsByLongestSideSet64x64 PASSED [ 97%] test/integration/test_thumbs.py::TestThumbs::testThumbnailExists PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testThumbnailVersion[one] PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testThumbnailVersion[set] PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testGetThumbnailx64x64 PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testGetThumbnailByLongestSidex64 PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testGetThumbnailDirectx64x64 PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testGetThumbnailForSectionDirectx0x0x64x64 PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testGetThumbnailx64x60 PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testGetThumbnailByLongestSidex60 PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testGetThumbnailDirectx64x60 PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testGetThumbnailForSectionDirectx0x0x64x60 PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testGetThumbnailSetx64x64 PASSED [ 98%] test/integration/test_thumbs.py::TestThumbs::testGetThumbnailByLongestSideSetx64 PASSED [ 98%] test/integration/test_tickets1000.py::TestTicket1000::test711 PASSED [ 98%] test/integration/test_tickets1000.py::TestTicket1000::test843 PASSED [ 98%] test/integration/test_tickets1000.py::TestTicket1000::test880 PASSED [ 98%] test/integration/test_tickets1000.py::TestTicket1000::test883WithoutClose PASSED [ 98%] test/integration/test_tickets1000.py::TestTicket1000::test883WithClose PASSED [ 98%] test/integration/test_tickets1000.py::TestTicket1000::test883Upload PASSED [ 98%] test/integration/test_tickets1000.py::TestTicket1000::test985 PASSED [ 98%] test/integration/test_tickets2000.py::TestTickets2000::test1064 PASSED [ 99%] test/integration/test_tickets2000.py::TestTickets2000::test1067 PASSED [ 99%] test/integration/test_tickets2000.py::TestTickets2000::test1027 PASSED [ 99%] test/integration/test_tickets2000.py::TestTickets2000::test1069 PASSED [ 99%] test/integration/test_tickets2000.py::TestTickets2000::test1071 PASSED [ 99%] test/integration/test_tickets2000.py::TestTickets2000::test1071_1 PASSED [ 99%] test/integration/test_tickets2000.py::TestTickets2000::test1072 PASSED [ 99%] test/integration/test_tickets2000.py::TestTickets2000::test1088 PASSED [ 99%] test/integration/test_tickets2000.py::TestTickets2000::test1109 PASSED [ 99%] test/integration/test_tickets2000.py::TestTickets2000::test1163 PASSED [ 99%] test/integration/test_tickets2000.py::TestTickets2000::test1183 
PASSED [ 99%]
test/integration/test_tickets3000.py::TestTickets3000::test2396 PASSED [ 99%]
test/integration/test_tickets3000.py::TestTickets3000::test2547 PASSED [ 99%]
test/integration/test_tickets3000.py::TestTickets3000::test2628 PASSED [ 99%]
test/integration/test_tickets3000.py::TestTickets3000::test2952 PASSED [ 99%]
test/integration/test_tickets3000.py::TestTickets3000::test2762 PASSED [ 99%]
test/integration/test_tickets4000.py::TestTickets4000::testChangeActiveGroup PASSED [ 99%]
test/integration/test_tickets4000.py::TestTickets4000::testChangeActiveGroupWhenConnectionLost PASSED [ 99%]
test/integration/test_tickets4000.py::TestTickets4000::test3201 PASSED [ 99%]
test/integration/test_tickets4000.py::TestTickets4000::test3131 PASSED [ 99%]
test/integration/test_util.py::TestUpgradeCheck::testReal FAILED [100%]
==================================== ERRORS ====================================
______________ ERROR at setup of TestConnectionMethods.testClose _______________
self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h =
def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server.
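All of the setup ERRORs reported in this run fail in the same urllib code path while trying to reach downloads.openmicroscopy.org on port 443. A minimal stand-alone probe of that host (a sketch only; the 5-second timeout is an arbitrary choice, not a value used by the suite):

    import socket

    HOST, PORT = "downloads.openmicroscopy.org", 443

    try:
        # socket.create_connection() is the same stdlib helper these tracebacks
        # fail in; a short timeout makes the probe fail fast instead of waiting
        # out the kernel's connect timeout.
        with socket.create_connection((HOST, PORT), timeout=5) as sock:
            print("reachable:", sock.getpeername())
    except OSError as err:  # TimeoutError: [Errno 110] is a subclass of OSError
        print("unreachable:", err)
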
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:28:25,056 INFO [ omero.gateway] (MainThread) created connection (uuid=c0aef918-8bf0-4ace-bb07-9544faee9b16) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=c0aef918-8bf0-4ace-bb07-9544faee9b16) _________ ERROR at setup of TestConnectionMethods.testTopLevelObjects __________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. """ gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:30:34,380 INFO [ omero.gateway] (MainThread) created connection (uuid=e280688d-a79f-4cec-8213-9ef307f1e42b) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=e280688d-a79f-4cec-8213-9ef307f1e42b) ____________ ERROR at setup of TestFindObject.testFindExperimenter _____________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg_tiny(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTinyTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:61: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:161: in getTinyTestImage return dbhelpers.getImage(self.gateway, 'tinyimg', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:33:01,235 INFO [ omero.gateway] (MainThread) created connection (uuid=a2a3234b-09dd-4efe-b798-9452622ebc0d) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=a2a3234b-09dd-4efe-b798-9452622ebc0d) ______________ ERROR at setup of TestGetObject.testGetAnnotations ______________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. 
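In these tracebacks the original TimeoutError survives as the reason attribute of the URLError that do_open() raises, so a caller can still tell a connect timeout apart from other failures, for example when deciding whether to retry. A sketch, with a hypothetical URL:

    import urllib.error
    import urllib.request

    def fetch_or_none_on_timeout(url):
        # Returns the body, or None when the failure was a connect timeout.
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return resp.read()
        except urllib.error.URLError as err:
            # do_open() wraps the original OSError, so the TimeoutError shown in
            # these tracebacks reappears as err.reason.
            if isinstance(err.reason, TimeoutError):
                return None
            raise

    data = fetch_or_none_on_timeout("https://downloads.openmicroscopy.org/")  # hypothetical URL
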
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg_tiny(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. """ gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTinyTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:61: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:161: in getTinyTestImage return dbhelpers.getImage(self.gateway, 'tinyimg', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:35:38,675 INFO [ omero.gateway] (MainThread) created connection (uuid=eff95d22-c84a-4f41-8431-b38987a8b34f) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=eff95d22-c84a-4f41-8431-b38987a8b34f) _________________ ERROR at setup of TestGetObject.testGetImage _________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg_tiny(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTinyTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:61: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:161: in getTinyTestImage return dbhelpers.getImage(self.gateway, 'tinyimg', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:37:48,829 INFO [ omero.gateway] (MainThread) created connection (uuid=6da27b96-1072-4c32-bc59-d4402a26177c) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=6da27b96-1072-4c32-bc59-d4402a26177c) ______ ERROR at setup of TestGetObject.testGetImageLoadPixels[True-True] _______ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg_tiny(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. """ gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTinyTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:61: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:161: in getTinyTestImage return dbhelpers.getImage(self.gateway, 'tinyimg', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:39:59,338 INFO [ omero.gateway] (MainThread) created connection (uuid=0a3bbbb0-0bc9-41d1-9cc1-0c31bfaade24) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=0a3bbbb0-0bc9-41d1-9cc1-0c31bfaade24) ______ ERROR at setup of TestGetObject.testGetImageLoadPixels[True-False] ______ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg_tiny(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTinyTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:61: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:161: in getTinyTestImage return dbhelpers.getImage(self.gateway, 'tinyimg', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:42:10,428 INFO [ omero.gateway] (MainThread) created connection (uuid=c7318f46-f96e-4467-a2d3-816abd46473c) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=c7318f46-f96e-4467-a2d3-816abd46473c) ______ ERROR at setup of TestGetObject.testGetImageLoadPixels[False-True] ______ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg_tiny(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. """ gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTinyTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:61: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:161: in getTinyTestImage return dbhelpers.getImage(self.gateway, 'tinyimg', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:44:21,478 INFO [ omero.gateway] (MainThread) created connection (uuid=747652cb-2ae8-4d07-82b2-2b55f1f20aa5) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=747652cb-2ae8-4d07-82b2-2b55f1f20aa5) _____ ERROR at setup of TestGetObject.testGetImageLoadPixels[False-False] ______ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg_tiny(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTinyTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:61: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:161: in getTinyTestImage return dbhelpers.getImage(self.gateway, 'tinyimg', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:46:32,567 INFO [ omero.gateway] (MainThread) created connection (uuid=efc58809-5c94-4bc7-ab11-68e79b0cea9b) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=efc58809-5c94-4bc7-ab11-68e79b0cea9b) ________________ ERROR at setup of TestGetObject.testTraversal _________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg_tiny(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. """ gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTinyTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:61: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:161: in getTinyTestImage return dbhelpers.getImage(self.gateway, 'tinyimg', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:48:43,664 INFO [ omero.gateway] (MainThread) created connection (uuid=8a9826e2-9af9-4308-af24-5a264c1e3e59) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=8a9826e2-9af9-4308-af24-5a264c1e3e59) __________________ ERROR at setup of TestImage.testThumbnail ___________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:51:16,401 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:16,407 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:16,446 INFO [ omero.gateway] (MainThread) created connection (uuid=dd643949-0d17-4e49-87ff-b2c5619e6381) 2024-10-24 04:51:16,904 INFO [ omero.gateway] (MainThread) created connection (uuid=c9b28279-b618-4eb8-a6a4-b1ff778c9b91) 2024-10-24 04:51:16,940 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:17,007 INFO [ omero.gateway] (MainThread) created connection (uuid=9a70eb72-4065-4736-8f92-cd0ebd992d33) 2024-10-24 04:51:17,043 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:17,081 INFO [ omero.gateway] (MainThread) created connection (uuid=32f168b7-ed5a-4b55-9548-66752a25798c) 2024-10-24 04:51:17,125 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:17,164 INFO [ omero.gateway] (MainThread) created connection (uuid=8edab2fc-5554-4109-91b6-ec47155bae61) 2024-10-24 04:51:17,203 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:17,241 INFO [ omero.gateway] (MainThread) created connection (uuid=7e33b4ec-aec6-4f15-942e-fa61bf426720) 2024-10-24 04:51:17,278 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:17,316 INFO [ omero.gateway] (MainThread) created connection (uuid=bf3cf60a-4575-4d96-86b2-b6f3c616fd33) 2024-10-24 04:51:17,353 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:17,392 INFO [ omero.gateway] (MainThread) created connection (uuid=51e3b34c-2d8f-4396-a50e-72554b0a1238) 2024-10-24 04:51:17,439 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:17,476 INFO [ omero.gateway] (MainThread) created connection (uuid=d2a10ca0-557e-423d-aee4-2b99eb2fc012) 2024-10-24 04:51:17,525 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:17,562 INFO [ omero.gateway] (MainThread) created connection (uuid=69811a9e-7a7b-4907-a611-679bda706805) 2024-10-24 04:51:17,607 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:17,611 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:51:17,649 INFO [ omero.gateway] (MainThread) created connection (uuid=cb774f86-4871-4c30-99e0-7103123ed12d) 2024-10-24 04:51:17,717 INFO [ omero.gateway] (MainThread) created connection (uuid=6a45c096-94ca-44ce-8c91-6c11667b1ce8) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=dd643949-0d17-4e49-87ff-b2c5619e6381) INFO omero.gateway:__init__.py:2243 created connection (uuid=c9b28279-b618-4eb8-a6a4-b1ff778c9b91) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=9a70eb72-4065-4736-8f92-cd0ebd992d33) INFO 
omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=32f168b7-ed5a-4b55-9548-66752a25798c) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=8edab2fc-5554-4109-91b6-ec47155bae61) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=7e33b4ec-aec6-4f15-942e-fa61bf426720) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=bf3cf60a-4575-4d96-86b2-b6f3c616fd33) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=51e3b34c-2d8f-4396-a50e-72554b0a1238) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=d2a10ca0-557e-423d-aee4-2b99eb2fc012) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=69811a9e-7a7b-4907-a611-679bda706805) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=cb774f86-4871-4c30-99e0-7103123ed12d) INFO omero.gateway:__init__.py:2243 created connection (uuid=6a45c096-94ca-44ce-8c91-6c11667b1ce8) _________________ ERROR at setup of TestImage.testThumbnailSet _________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 04:53:26,738 INFO [ omero.gateway] (MainThread) created connection (uuid=c20a2195-ec3c-491f-9872-28a51cd1671e) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=c20a2195-ec3c-491f-9872-28a51cd1671e) _______________ ERROR at setup of TestImage.testRenderingModels ________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. 
[Each of the following TestImage setup errors fails in the same way: author_testimg times out at the TCP level ([Errno 110] Connection timed out) while downloading the test image from downloads.openmicroscopy.org:443, so only the test name, the timestamp and the connection uuid differ from the traceback above.]
_______________ ERROR at setup of TestImage.testRenderingModels ________________
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>
---------------------------- Captured stderr setup -----------------------------
2024-10-24 04:55:37,314 INFO [ omero.gateway] (MainThread) created connection (uuid=9e967b96-ead3-40cd-ae7b-a67df863e1f0) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=9e967b96-ead3-40cd-ae7b-a67df863e1f0)
_________________ ERROR at setup of TestImage.testSplitChannel _________________
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>
---------------------------- Captured stderr setup -----------------------------
2024-10-24 04:57:48,398 INFO [ omero.gateway] (MainThread) created connection (uuid=f6c0d3e5-9409-45a2-970a-fadf74b044b5) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=f6c0d3e5-9409-45a2-970a-fadf74b044b5)
__________________ ERROR at setup of TestImage.testLinePlots ___________________
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>
---------------------------- Captured stderr setup -----------------------------
2024-10-24 04:59:59,459 INFO [ omero.gateway] (MainThread) created connection (uuid=69b98b7b-d463-4623-a765-06548264689d) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=69b98b7b-d463-4623-a765-06548264689d)
_________________ ERROR at setup of TestImage.testProjections __________________
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:02:10,526 INFO [ omero.gateway] (MainThread) created connection (uuid=ae30536e-8aa3-4e6f-b112-4688057b39d3) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=ae30536e-8aa3-4e6f-b112-4688057b39d3)
__________________ ERROR at setup of TestImage.testProperties __________________
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:04:21,610 INFO [ omero.gateway] (MainThread) created connection (uuid=e91b1503-f5f8-48fe-a9ec-012de08bf428) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=e91b1503-f5f8-48fe-a9ec-012de08bf428)
________________ ERROR at setup of TestImage.testPixelSizeUnits ________________
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:06:32,746 INFO [ omero.gateway] (MainThread) created connection (uuid=9cb7a4e0-bd49-4417-8c9c-e863ae0ec13c) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=9cb7a4e0-bd49-4417-8c9c-e863ae0ec13c)
________________ ERROR at setup of TestImage.testUnitsGetValue _________________
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:08:43,764 INFO [ omero.gateway] (MainThread) created connection (uuid=9dff15a5-62d3-4351-a3cc-5c8d289381bc) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=9dff15a5-62d3-4351-a3cc-5c8d289381bc)
____________ ERROR at setup of TestImage.testChannelWavelengthUnits ____________
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:10:55,201 INFO [ omero.gateway] (MainThread) created connection (uuid=7832efeb-0aef-4af1-a528-c1884764e735) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=7832efeb-0aef-4af1-a528-c1884764e735)
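Consecutive setup errors above are spaced a little over two minutes apart, consistent with each test waiting out a full OS-level TCP connect timeout before the fixture gives up: the traceback shows dbhelpers.py calling urllib.request.urlopen() with no explicit timeout. If the download host is slow or flaky rather than completely unreachable, a bounded fetch along these lines would cap that cost; fetch_with_retries and its timeout/retry parameters are a sketch, not the actual dbhelpers code.

# Sketch of a bounded download helper (hypothetical; the call seen in the
# traceback passes no timeout, so a dead host costs one full TCP timeout).
import time
import urllib.request

def fetch_with_retries(url, dest, timeout=15, retries=2, backoff=5):
    """Download url to dest with an explicit timeout, retrying on network errors."""
    last_err = None
    for attempt in range(retries + 1):
        try:
            # urllib.error.URLError is an OSError subclass, so OSError covers both
            with urllib.request.urlopen(url, timeout=timeout) as resp, \
                 open(dest, "wb") as out:
                out.write(resp.read())
            return dest
        except OSError as err:
            last_err = err
            if attempt < retries:
                time.sleep(backoff * (attempt + 1))
    raise last_err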
______________ ERROR at setup of TestImage.testExposureTimeUnits _______________

self =   http_class =   req =   http_conn_args = {'check_hostname': None, 'context': None}
host = 'downloads.openmicroscopy.org'   h =

/usr/lib64/python3.9/urllib/request.py:1346: in do_open
>   h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.9/http/client.py:1285: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib64/python3.9/http/client.py:1331: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib64/python3.9/http/client.py:1280: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib64/python3.9/http/client.py:1040: in _send_output
    self.send(msg)
/usr/lib64/python3.9/http/client.py:980: in send
    self.connect()
/usr/lib64/python3.9/http/client.py:1447: in connect
    super().connect()
/usr/lib64/python3.9/http/client.py:946: in connect
    self.sock = self._create_connection(
/usr/lib64/python3.9/socket.py:844: in create_connection
    raise err
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('downloads.openmicroscopy.org', 443)
timeout =   , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object.  Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting returned
        by :func:`getdefaulttimeout` is used.  If *source_address* is set it must
        be a tuple of (host, port) for the socket to bind as a source address
        before making the connection.  A host of '' or port 0 tells the OS to use
        the default.
        """

        host, port = address
        err = None
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               TimeoutError: [Errno 110] Connection timed out

/usr/lib64/python3.9/socket.py:832: TimeoutError

During handling of the above exception, another exception occurred:

request =   gatewaywrapper =

    @pytest.fixture(scope='function')
    def author_testimg(request, gatewaywrapper):
        """
        logs in as Author and returns the test image, creating it first if needed.
        """
        gatewaywrapper.loginAsAuthor()
>       rv = gatewaywrapper.getTestImage(autocreate=True)

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage
    return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset,
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage
    i = IMAGES[alias].create()
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)
/usr/lib64/python3.9/urllib/request.py:214: in urlopen
/usr/lib64/python3.9/urllib/request.py:517: in open
/usr/lib64/python3.9/urllib/request.py:534: in _open
/usr/lib64/python3.9/urllib/request.py:494: in _call_chain
/usr/lib64/python3.9/urllib/request.py:1389: in https_open
/usr/lib64/python3.9/urllib/request.py:1349: in do_open
>   raise URLError(err)
E   urllib.error.URLError:
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:13:05,880 INFO [ omero.gateway] (MainThread) created connection (uuid=542b4009-14e6-4455-9e7b-8a4f367c653c) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=542b4009-14e6-4455-9e7b-8a4f367c653c)
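The captured-stderr timestamps show how expensive each of these errors is: 05:10:55 and 05:13:05 above are roughly 130 s apart, and the later blocks repeat at the same spacing, i.e. every errored setup waits out the kernel's TCP connect timeout before pytest can move on. A small sketch (not part of the build) that checks the spacing from the two timestamps already shown:

    from datetime import datetime

    stamps = ["2024-10-24 05:10:55,201", "2024-10-24 05:13:05,880"]
    times = [datetime.strptime(s, "%Y-%m-%d %H:%M:%S,%f") for s in stamps]
    for earlier, later in zip(times, times[1:]):
        print((later - earlier).total_seconds())   # ~130 s per errored setup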
__________________ ERROR at setup of TestImage.testShortname ___________________

(same TimeoutError/URLError chain as TestImage.testExposureTimeUnits above; key frames only)

host = 'downloads.openmicroscopy.org'
/usr/lib64/python3.9/urllib/request.py:1346: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib64/python3.9/socket.py:832: in create_connection
>   sock.connect(sa)
E   TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: in author_testimg
>   rv = gatewaywrapper.getTestImage(autocreate=True)
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)
/usr/lib64/python3.9/urllib/request.py:1349: in do_open
>   raise URLError(err)
E   urllib.error.URLError:
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:15:16,954 INFO [ omero.gateway] (MainThread) created connection (uuid=314967b9-33c8-4371-8cff-b55386d7fe57) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=314967b9-33c8-4371-8cff-b55386d7fe57)
__________________ ERROR at setup of TestImage.testImageDate ___________________

(same TimeoutError/URLError chain as TestImage.testExposureTimeUnits above; key frames only)

host = 'downloads.openmicroscopy.org'
/usr/lib64/python3.9/urllib/request.py:1346: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib64/python3.9/socket.py:832: in create_connection
>   sock.connect(sa)
E   TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: in author_testimg
>   rv = gatewaywrapper.getTestImage(autocreate=True)
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)
/usr/lib64/python3.9/urllib/request.py:1349: in do_open
>   raise URLError(err)
E   urllib.error.URLError:
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:17:28,032 INFO [ omero.gateway] (MainThread) created connection (uuid=4fee3276-8e7c-4975-83ef-a31cf62b0820) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=4fee3276-8e7c-4975-83ef-a31cf62b0820)
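Setup only reaches this point after the OMERO connection has been created; it is the follow-up download in dbhelpers.py:419 (urllib.request.urlopen(TESTIMG_URL + self.filename), with no timeout argument) that hangs. A hedged way to reproduce just that step on the build node and fail fast; TESTIMG_URL and the filename are not shown in this log, so the values below are placeholders:

    import urllib.request

    TESTIMG_URL = "https://downloads.openmicroscopy.org/"   # placeholder, not the real constant
    FILENAME = "example-test-image.dv"                       # placeholder filename

    try:
        with urllib.request.urlopen(TESTIMG_URL + FILENAME, timeout=10) as fin:
            print("reachable:", fin.status, len(fin.read(1024)), "bytes read")
    except OSError as exc:   # urllib.error.URLError is an OSError subclass
        print("download still failing:", exc)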
________________ ERROR at setup of TestImage.testSimpleMarshal _________________

(same TimeoutError/URLError chain as TestImage.testExposureTimeUnits above; key frames only)

host = 'downloads.openmicroscopy.org'
/usr/lib64/python3.9/urllib/request.py:1346: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib64/python3.9/socket.py:832: in create_connection
>   sock.connect(sa)
E   TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: in author_testimg
>   rv = gatewaywrapper.getTestImage(autocreate=True)
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)
/usr/lib64/python3.9/urllib/request.py:1349: in do_open
>   raise URLError(err)
E   urllib.error.URLError:
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:19:39,158 INFO [ omero.gateway] (MainThread) created connection (uuid=6afce519-9be3-4941-982e-56b631ec4172) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=6afce519-9be3-4941-982e-56b631ec4172)
____________________ ERROR at setup of TestImage.testExport ____________________

(same TimeoutError/URLError chain as TestImage.testExposureTimeUnits above; key frames only)

host = 'downloads.openmicroscopy.org'
/usr/lib64/python3.9/urllib/request.py:1346: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib64/python3.9/socket.py:832: in create_connection
>   sock.connect(sa)
E   TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: in author_testimg
>   rv = gatewaywrapper.getTestImage(autocreate=True)
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)
/usr/lib64/python3.9/urllib/request.py:1349: in do_open
>   raise URLError(err)
E   urllib.error.URLError:
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:21:50,171 INFO [ omero.gateway] (MainThread) created connection (uuid=3d2a8a72-46ab-4577-a84a-4c309f303821) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=3d2a8a72-46ab-4577-a84a-4c309f303821)
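Because the urlopen() call in dbhelpers.py passes no timeout, each attempt inherits the OS connect timeout (~130 s here). One possible mitigation, sketched below and not something this job currently configures, is a session-wide default socket timeout set from a local conftest.py, which caps every socket that does not request its own timeout:

    import socket

    import pytest


    @pytest.fixture(autouse=True, scope="session")
    def short_socket_timeout():
        # Cap sockets created without an explicit timeout (including urlopen's).
        socket.setdefaulttimeout(30)
        yield
        socket.setdefaulttimeout(None)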
_______________ ERROR at setup of TestImage.testRenderJpegRegion _______________

(same TimeoutError/URLError chain as TestImage.testExposureTimeUnits above; key frames only)

host = 'downloads.openmicroscopy.org'
/usr/lib64/python3.9/urllib/request.py:1346: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib64/python3.9/socket.py:832: in create_connection
>   sock.connect(sa)
E   TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: in author_testimg
>   rv = gatewaywrapper.getTestImage(autocreate=True)
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)
/usr/lib64/python3.9/urllib/request.py:1349: in do_open
>   raise URLError(err)
E   urllib.error.URLError:
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:24:01,262 INFO [ omero.gateway] (MainThread) created connection (uuid=c450da2c-0d55-42e0-ac16-bca27421fc42) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=c450da2c-0d55-42e0-ac16-bca27421fc42)
_________ ERROR at setup of TestImage.testRenderJpegRegion_resolution __________

(same TimeoutError/URLError chain as TestImage.testExposureTimeUnits above; key frames only)

host = 'downloads.openmicroscopy.org'
/usr/lib64/python3.9/urllib/request.py:1346: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib64/python3.9/socket.py:832: in create_connection
>   sock.connect(sa)
E   TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: in author_testimg
>   rv = gatewaywrapper.getTestImage(autocreate=True)
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)
/usr/lib64/python3.9/urllib/request.py:1349: in do_open
>   raise URLError(err)
E   urllib.error.URLError:
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:26:12,317 INFO [ omero.gateway] (MainThread) created connection (uuid=9bf0b736-c53d-46df-8709-fc5d718dcf0b) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=9bf0b736-c53d-46df-8709-fc5d718dcf0b)
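Another option, again only a sketch and not part of the current suite, is to probe the download host once per session and skip the image-dependent fixtures instead of letting every test error out after ~130 s:

    import socket

    import pytest


    def downloads_reachable(host="downloads.openmicroscopy.org", port=443, timeout=5):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False


    @pytest.fixture(scope="session")
    def require_test_image_host():
        if not downloads_reachable():
            pytest.skip("downloads.openmicroscopy.org is not reachable from this node")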
_____ ERROR at setup of TestImage.testRenderJpegRegion_invalid_resolution ______

(same TimeoutError/URLError chain as TestImage.testExposureTimeUnits above; key frames only)

host = 'downloads.openmicroscopy.org'
/usr/lib64/python3.9/urllib/request.py:1346: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib64/python3.9/socket.py:832: in create_connection
>   sock.connect(sa)
E   TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: in author_testimg
>   rv = gatewaywrapper.getTestImage(autocreate=True)
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)
/usr/lib64/python3.9/urllib/request.py:1349: in do_open
>   raise URLError(err)
E   urllib.error.URLError:
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:28:23,736 INFO [ omero.gateway] (MainThread) created connection (uuid=047fa684-029e-43a1-a4bc-a1ef0f31b380) I<
------------------------------ Captured log setup ------------------------------
INFO omero.gateway:__init__.py:2243 created connection (uuid=047fa684-029e-43a1-a4bc-a1ef0f31b380)
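Note that the gateway login itself succeeds every time (each block still logs "created connection (uuid=...)"); only the outbound HTTPS download fails. A small sketch for summarising the affected tests from a saved copy of this console output (the console.log path is hypothetical):

    import re

    pattern = re.compile(r"ERROR at setup of (\S+)|created connection \(uuid=([0-9a-f-]+)\)")
    with open("console.log") as fh:
        for match in pattern.finditer(fh.read()):
            test, uuid = match.groups()
            print(test if test else "    uuid=" + uuid)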
______________ ERROR at setup of TestImage.testRenderBirdsEyeView ______________

host = 'downloads.openmicroscopy.org'

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib64/python3.9/urllib/request.py:1346:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.9/http/client.py:1285: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib64/python3.9/http/client.py:1331: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib64/python3.9/http/client.py:1280: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib64/python3.9/http/client.py:1040: in _send_output
    self.send(msg)
/usr/lib64/python3.9/http/client.py:980: in send
    self.connect()
/usr/lib64/python3.9/http/client.py:1447: in connect
    super().connect()
/usr/lib64/python3.9/http/client.py:946: in connect
    self.sock = self._create_connection(
/usr/lib64/python3.9/socket.py:844: in create_connection
    raise err
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('downloads.openmicroscopy.org', 443)
timeout =
source_address = None

>           sock.connect(sa)
E           TimeoutError: [Errno 110] Connection timed out

/usr/lib64/python3.9/socket.py:832: TimeoutError

During handling of the above exception, another exception occurred:

    @pytest.fixture(scope='function')
    def author_testimg(request, gatewaywrapper):
        """
        logs in as Author and returns the test image, creating it first if
        needed.
        """
        gatewaywrapper.loginAsAuthor()
>       rv = gatewaywrapper.getTestImage(autocreate=True)

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage
    return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset,
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage
    i = IMAGES[alias].create()
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)
/usr/lib64/python3.9/urllib/request.py:1389: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:30:34,475 INFO  [      omero.gateway] (MainThread) created connection (uuid=3a9df39a-49de-46c6-badc-45ec9e9e141d)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=3a9df39a-49de-46c6-badc-45ec9e9e141d)
___________ ERROR at setup of TestImage.testRenderBirdsEyeView_Size ____________

address = ('downloads.openmicroscopy.org', 443)

>           sock.connect(sa)
E           TimeoutError: [Errno 110] Connection timed out

/usr/lib64/python3.9/socket.py:832: TimeoutError

During handling of the above exception, another exception occurred:

        gatewaywrapper.loginAsAuthor()
>       rv = gatewaywrapper.getTestImage(autocreate=True)

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81:
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:32:45,553 INFO  [      omero.gateway] (MainThread) created connection (uuid=062bee96-9422-425f-b27c-f0f5b3fb33ea)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=062bee96-9422-425f-b27c-f0f5b3fb33ea)
_____________ ERROR at setup of TestPixels.testReuseRawPixelsStore _____________

address = ('downloads.openmicroscopy.org', 443)

>           sock.connect(sa)
E           TimeoutError: [Errno 110] Connection timed out

/usr/lib64/python3.9/socket.py:832: TimeoutError

During handling of the above exception, another exception occurred:

        gatewaywrapper.loginAsAuthor()
>       rv = gatewaywrapper.getTestImage(autocreate=True)

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81:
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:35:14,387 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:14,393 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:14,432 INFO  [      omero.gateway] (MainThread) created connection (uuid=892f4a8c-9732-4477-957c-fc9c91897b90)
2024-10-24 05:35:14,909 INFO  [      omero.gateway] (MainThread) created connection (uuid=138ea039-f2b4-43bd-b680-bd6454fe69c0)
2024-10-24 05:35:14,944 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:14,983 INFO  [      omero.gateway] (MainThread) created connection (uuid=fbbef218-4b76-48ce-80bf-0fc5e7efedb1)
2024-10-24 05:35:15,019 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:15,059 INFO  [      omero.gateway] (MainThread) created connection (uuid=3a3dc5a8-0d1a-430a-a9f2-fb8b39e2bbad)
2024-10-24 05:35:15,100 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:15,139 INFO  [      omero.gateway] (MainThread) created connection (uuid=4fb34144-437c-49ac-925c-94bd31644285)
2024-10-24 05:35:15,182 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:15,222 INFO  [      omero.gateway] (MainThread) created connection (uuid=56a5392d-79d9-404b-bfb3-042574b25d0e)
2024-10-24 05:35:15,262 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:15,300 INFO  [      omero.gateway] (MainThread) created connection (uuid=61ac66ec-aa45-4bb6-ba62-a430a74a3d2c)
2024-10-24 05:35:15,339 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:15,377 INFO  [      omero.gateway] (MainThread) created connection (uuid=a518dbb2-7086-4997-9412-8bd0509794e6)
2024-10-24 05:35:15,425 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:15,464 INFO  [      omero.gateway] (MainThread) created connection (uuid=d72f543a-59d7-4d51-aaed-fe5929f76bc4)
2024-10-24 05:35:15,514 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:15,551 INFO  [      omero.gateway] (MainThread) created connection (uuid=8f1cdd48-df33-441e-af92-8b519832e6fe)
2024-10-24 05:35:15,595 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:15,598 INFO  [      omero.gateway] (MainThread) closed connection (uuid=None)
2024-10-24 05:35:15,635 INFO  [      omero.gateway] (MainThread) created connection (uuid=d1f04a2e-6b1c-4dc2-b85f-0d440ed47ba2)
2024-10-24 05:35:15,705 INFO  [      omero.gateway] (MainThread) created connection (uuid=c956c467-7396-4be2-8e05-3a59148a1d71)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=892f4a8c-9732-4477-957c-fc9c91897b90)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=138ea039-f2b4-43bd-b680-bd6454fe69c0)
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=fbbef218-4b76-48ce-80bf-0fc5e7efedb1)
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=3a3dc5a8-0d1a-430a-a9f2-fb8b39e2bbad)
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=4fb34144-437c-49ac-925c-94bd31644285)
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=56a5392d-79d9-404b-bfb3-042574b25d0e)
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=61ac66ec-aa45-4bb6-ba62-a430a74a3d2c)
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=a518dbb2-7086-4997-9412-8bd0509794e6)
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=d72f543a-59d7-4d51-aaed-fe5929f76bc4)
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=8f1cdd48-df33-441e-af92-8b519832e6fe)
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:1940 closed connection (uuid=None)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=d1f04a2e-6b1c-4dc2-b85f-0d440ed47ba2)
INFO     omero.gateway:__init__.py:2243 created connection (uuid=c956c467-7396-4be2-8e05-3a59148a1d71)
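These errors are raised during fixture setup rather than in the tests themselves. A guard of roughly this shape (hypothetical, not the actual omero.gateway fixture; the base URL and timeout are assumptions) would turn the unreachable download host into a skip instead of an error:

    # Hypothetical pytest guard -- not the actual omero.gateway fixture.
    import urllib.error
    import urllib.request

    import pytest

    # Assumed base URL; the real TESTIMG_URL lives in omero/gateway/scripts/dbhelpers.py.
    TESTIMG_URL = "https://downloads.openmicroscopy.org/images/"

    @pytest.fixture(scope="session")
    def testimg_host_reachable():
        try:
            urllib.request.urlopen(TESTIMG_URL, timeout=10).close()
        except urllib.error.URLError as err:
            # URLError subclasses OSError, so this also covers the raw
            # TimeoutError seen in these tracebacks.
            pytest.skip("test image host unreachable: %s" % err)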
__________________ ERROR at setup of TestPixels.testPlaneInfo __________________

address = ('downloads.openmicroscopy.org', 443)

>           sock.connect(sa)
E           TimeoutError: [Errno 110] Connection timed out

/usr/lib64/python3.9/socket.py:832: TimeoutError

During handling of the above exception, another exception occurred:

        gatewaywrapper.loginAsAuthor()
>       rv = gatewaywrapper.getTestImage(autocreate=True)

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81:
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:37:24,669 INFO  [      omero.gateway] (MainThread) created connection (uuid=006958ad-cf39-4037-9c8e-3b9cb911de0d)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=006958ad-cf39-4037-9c8e-3b9cb911de0d)
_________________ ERROR at setup of TestPixels.testPixelsType __________________

address = ('downloads.openmicroscopy.org', 443)

>           sock.connect(sa)
E           TimeoutError: [Errno 110] Connection timed out

/usr/lib64/python3.9/socket.py:832: TimeoutError

During handling of the above exception, another exception occurred:

        gatewaywrapper.loginAsAuthor()
>       rv = gatewaywrapper.getTestImage(autocreate=True)

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81:
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:39:35,196 INFO  [      omero.gateway] (MainThread) created connection (uuid=4c5358c5-c12b-4096-9847-ca81352469d5)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=4c5358c5-c12b-4096-9847-ca81352469d5)
___________________ ERROR at setup of TestPixels.testGetTile ___________________

address = ('downloads.openmicroscopy.org', 443)

>           sock.connect(sa)
E           TimeoutError: [Errno 110] Connection timed out

/usr/lib64/python3.9/socket.py:832: TimeoutError

During handling of the above exception, another exception occurred:

        gatewaywrapper.loginAsAuthor()
>       rv = gatewaywrapper.getTestImage(autocreate=True)

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81:
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:41:46,309 INFO  [      omero.gateway] (MainThread) created connection (uuid=178db830-a5ce-4512-9822-4d6d599f3220)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=178db830-a5ce-4512-9822-4d6d599f3220)
__________________ ERROR at setup of TestPixels.testGetPlane ___________________

address = ('downloads.openmicroscopy.org', 443)

>           sock.connect(sa)
E           TimeoutError: [Errno 110] Connection timed out

/usr/lib64/python3.9/socket.py:832: TimeoutError

During handling of the above exception, another exception occurred:

        gatewaywrapper.loginAsAuthor()
>       rv = gatewaywrapper.getTestImage(autocreate=True)

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81:
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:43:57,291 INFO  [      omero.gateway] (MainThread) created connection (uuid=6f11b22e-dd46-4677-832d-1f9bed169276)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=6f11b22e-dd46-4677-832d-1f9bed169276)
________ ERROR at setup of TestPixels.testGetPlanesExceptionOnGetPlane _________

address = ('downloads.openmicroscopy.org', 443)

>           sock.connect(sa)
E           TimeoutError: [Errno 110] Connection timed out

/usr/lib64/python3.9/socket.py:832: TimeoutError

During handling of the above exception, another exception occurred:

        gatewaywrapper.loginAsAuthor()
>       rv = gatewaywrapper.getTestImage(autocreate=True)

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81:
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 05:46:08,382 INFO  [      omero.gateway] (MainThread) created connection (uuid=903d25d0-5060-42f7-9302-b95e8a058ac3)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=903d25d0-5060-42f7-9302-b95e8a058ac3)
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. """ gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 05:48:19,418 INFO [ omero.gateway] (MainThread) created connection (uuid=0e3a6664-f946-4094-a287-4582f31e7c37) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=0e3a6664-f946-4094-a287-4582f31e7c37) __________ ERROR at setup of TestPixels.testGetPlanesExceptionOnBoth ___________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 05:50:30,515 INFO [ omero.gateway] (MainThread) created connection (uuid=d440c3d6-b9a1-4a22-bcb7-20696560b622) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=d440c3d6-b9a1-4a22-bcb7-20696560b622) ________________ ERROR at setup of TestPixels.testGetHistogram _________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. """ gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 05:52:42,162 INFO [ omero.gateway] (MainThread) created connection (uuid=0bb54872-4eab-4081-a7be-8fa31840bf56) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=0bb54872-4eab-4081-a7be-8fa31840bf56) ___________________ ERROR at setup of TestRDefs.testDefault ____________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 05:54:56,373 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:56,381 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:56,426 INFO [ omero.gateway] (MainThread) created connection (uuid=2f5ce2c1-dcdb-4657-bfce-489bb3c6782e) 2024-10-24 05:54:56,980 INFO [ omero.gateway] (MainThread) created connection (uuid=e6ace603-d3d9-4f17-a039-a56b2872a4a4) 2024-10-24 05:54:57,025 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:57,070 INFO [ omero.gateway] (MainThread) created connection (uuid=ed3b09d2-c6be-4122-9f15-6febd5d12f1e) 2024-10-24 05:54:57,117 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:57,161 INFO [ omero.gateway] (MainThread) created connection (uuid=5cd37530-1f62-4e6f-9487-b61e5365ea58) 2024-10-24 05:54:57,210 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:57,263 INFO [ omero.gateway] (MainThread) created connection (uuid=5257d7bc-81ca-4099-ad90-79c0203f39dc) 2024-10-24 05:54:57,315 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:57,359 INFO [ omero.gateway] (MainThread) created connection (uuid=6e69565f-d48b-4f6f-82b2-a0abf1717e49) 2024-10-24 05:54:57,406 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:57,454 INFO [ omero.gateway] (MainThread) created connection (uuid=305133fa-cddc-400f-a0da-c8eacfe065b5) 2024-10-24 05:54:57,691 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:57,738 INFO [ omero.gateway] (MainThread) created connection (uuid=a9967354-2e4c-40eb-ab73-211b67299164) 2024-10-24 05:54:57,799 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:57,848 INFO [ omero.gateway] (MainThread) created connection (uuid=3454d4c5-f931-499c-ac78-65769e2d5099) 2024-10-24 05:54:57,914 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:57,962 INFO [ omero.gateway] (MainThread) created connection (uuid=958b2547-35dd-4c8d-b9e7-65682cb4110e) 2024-10-24 05:54:58,024 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:58,030 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 05:54:58,077 INFO [ omero.gateway] (MainThread) created connection (uuid=73e3309f-7f04-4a0a-b99f-e8edd09b8b83) 2024-10-24 05:54:58,167 INFO [ omero.gateway] (MainThread) created connection (uuid=0bf040a7-7eef-4e9f-9e6d-2b7a7b27bb74) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=2f5ce2c1-dcdb-4657-bfce-489bb3c6782e) INFO omero.gateway:__init__.py:2243 created connection (uuid=e6ace603-d3d9-4f17-a039-a56b2872a4a4) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=ed3b09d2-c6be-4122-9f15-6febd5d12f1e) INFO 
omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=5cd37530-1f62-4e6f-9487-b61e5365ea58) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=5257d7bc-81ca-4099-ad90-79c0203f39dc) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=6e69565f-d48b-4f6f-82b2-a0abf1717e49) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=305133fa-cddc-400f-a0da-c8eacfe065b5) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=a9967354-2e4c-40eb-ab73-211b67299164) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=3454d4c5-f931-499c-ac78-65769e2d5099) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=958b2547-35dd-4c8d-b9e7-65682cb4110e) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=73e3309f-7f04-4a0a-b99f-e8edd09b8b83) INFO omero.gateway:__init__.py:2243 created connection (uuid=0bf040a7-7eef-4e9f-9e6d-2b7a7b27bb74) __________________ ERROR at setup of TestRDefs.testCustomized __________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 05:57:07,869 INFO [ omero.gateway] (MainThread) created connection (uuid=22168e23-32d2-4e55-acfb-28613a615536) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=22168e23-32d2-4e55-acfb-28613a615536) ________________ ERROR at setup of TestRDefs.testChannelWindows ________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. """ gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 05:59:18,889 INFO [ omero.gateway] (MainThread) created connection (uuid=1d5b59c9-fc57-47f5-882f-ce509d9a416a) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=1d5b59c9-fc57-47f5-882f-ce509d9a416a) ______________ ERROR at setup of TestRDefs.testFloatDefaultMinMax ______________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 06:01:29,975 INFO [ omero.gateway] (MainThread) created connection (uuid=43a801fa-e332-4bc4-b0fe-c11ef2b598f0) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=43a801fa-e332-4bc4-b0fe-c11ef2b598f0) _________________ ERROR at setup of TestRDefs.testEmissionWave _________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. 
_________________ ERROR at setup of TestRDefs.testEmissionWave _________________

self = <...>, http_class = <...>, req = <...>
http_conn_args = {'check_hostname': None, 'context': None}
host = 'downloads.openmicroscopy.org', h = <...>

    def do_open(self, http_class, req, **http_conn_args):
        [... same urllib.request do_open() listing as above ...]
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib64/python3.9/urllib/request.py:1346:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.9/http/client.py:1285: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib64/python3.9/http/client.py:1331: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib64/python3.9/http/client.py:1280: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib64/python3.9/http/client.py:1040: in _send_output
    self.send(msg)
/usr/lib64/python3.9/http/client.py:980: in send
    self.connect()
/usr/lib64/python3.9/http/client.py:1447: in connect
    super().connect()
/usr/lib64/python3.9/http/client.py:946: in connect
    self.sock = self._create_connection(
/usr/lib64/python3.9/socket.py:844: in create_connection
    raise err
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('downloads.openmicroscopy.org', 443), timeout = <...>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        """

        host, port = address
        err = None
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               TimeoutError: [Errno 110] Connection timed out

/usr/lib64/python3.9/socket.py:832: TimeoutError

During handling of the above exception, another exception occurred:

request = <...>, gatewaywrapper = <...>

    @pytest.fixture(scope='function')
    def author_testimg(request, gatewaywrapper):
        """
        logs in as Author and returns the test image, creating it first if
        needed.
        """
        gatewaywrapper.loginAsAuthor()
>       rv = gatewaywrapper.getTestImage(autocreate=True)

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... getTestImage()/dbhelpers.create()/urlopen() frames and do_open() listing,
 identical to the traceback above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:03:41,021 INFO  [ omero.gateway] (MainThread) created connection (uuid=2411a4d0-eb06-41e1-aab5-30afde6b8dad)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=2411a4d0-eb06-41e1-aab5-30afde6b8dad)
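Note: the chained TimeoutError comes from socket.create_connection(). Because dbhelpers calls urlopen() without a timeout, each connect attempt waits for the platform default (about 130 s on this builder), so every one of these setup errors costs over two minutes of build time. One way to fail fast is sketched below; this is a conftest.py suggestion rather than anything in the current job, and the 15 s figure is an arbitrary choice.

# conftest.py sketch -- fail fast when downloads.openmicroscopy.org is unreachable
import socket

def pytest_sessionstart(session):
    # Sockets created without an explicit timeout (as in dbhelpers.create())
    # inherit this default, so an unreachable host aborts in ~15 s instead of
    # the platform's TCP connect timeout.
    socket.setdefaulttimeout(15)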
__________________ ERROR at setup of TestRDefs.testBatchCopy ___________________

[... identical do_open()/create_connection() traceback: author_testimg calls
 getTestImage(autocreate=True), dbhelpers.create() tries to fetch the test
 image from downloads.openmicroscopy.org, and the connect times out ...]

E               TimeoutError: [Errno 110] Connection timed out
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:05:52,112 INFO  [ omero.gateway] (MainThread) created connection (uuid=783c2510-48e7-4059-92c1-b1444a6dbe81)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=783c2510-48e7-4059-92c1-b1444a6dbe81)
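Note: every block repeats the same dbhelpers chain, so one way to make the whole test class independent of external connectivity is to point the download at a local mirror. The sketch below is only illustrative: it assumes TESTIMG_URL is a plain module-level string in omero.gateway.scripts.dbhelpers (as the urlopen(TESTIMG_URL + self.filename) frame suggests) and that a mirror on localhost:8000 already serves the expected files.

# conftest.py sketch -- serve the test images from a local mirror (assumed to exist)
import pytest
from omero.gateway.scripts import dbhelpers

@pytest.fixture(autouse=True)
def local_testimg_mirror(monkeypatch):
    # Redirect the download used by dbhelpers.create(); the URL is a placeholder.
    monkeypatch.setattr(dbhelpers, "TESTIMG_URL", "http://localhost:8000/images/")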
____________ ERROR at setup of TestRDefs.testGroupBasedPermissions _____________

[... identical do_open()/create_connection() traceback while fetching the test
 image from downloads.openmicroscopy.org ...]

E               TimeoutError: [Errno 110] Connection timed out
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:08:03,169 INFO  [ omero.gateway] (MainThread) created connection (uuid=01b678a4-cb77-4b51-a4ba-10b8dacd5d3a)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=01b678a4-cb77-4b51-a4ba-10b8dacd5d3a)
___________________ ERROR at setup of TestRDefs.testGetRdefs ___________________

[... identical do_open()/create_connection() traceback while fetching the test
 image from downloads.openmicroscopy.org ...]

E               TimeoutError: [Errno 110] Connection timed out
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:10:14,668 INFO  [ omero.gateway] (MainThread) created connection (uuid=6813e06a-0a31-4f8d-bd64-1a23d7605574)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=6813e06a-0a31-4f8d-bd64-1a23d7605574)
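Note: if the download host is merely flaky rather than blocked from this builder, a bounded retry is often enough to get the fixture image created once (it is then reused by later tests). The helper below is generic Python, not part of the OMERO test helpers, and the attempt/backoff numbers are illustrative.

# retry sketch -- not OMERO-specific; names and defaults are illustrative
import time
import urllib.error
import urllib.request

def fetch_with_retries(url, attempts=3, timeout=10, backoff=2.0):
    for attempt in range(1, attempts + 1):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except urllib.error.URLError:
            if attempt == attempts:
                raise
            time.sleep(backoff ** attempt)  # back off 2 s, 4 s, ... between tries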
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. """ gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 06:12:25,319 INFO [ omero.gateway] (MainThread) created connection (uuid=9c8910ae-fc5b-4dbe-b777-d9b877933445) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=9c8910ae-fc5b-4dbe-b777-d9b877933445) _____________ ERROR at setup of TestRDefs.testQuantizationSettings _____________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 06:14:36,400 INFO [ omero.gateway] (MainThread) created connection (uuid=115c1894-0c56-4a94-9887-c6e4acfc62ad) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=115c1894-0c56-4a94-9887-c6e4acfc62ad) _________ ERROR at setup of TestRDefs.testQuantizationSettingsInvalid __________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. """ gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 06:16:47,462 INFO [ omero.gateway] (MainThread) created connection (uuid=aa0cf633-9051-4f62-82e5-7df7efc44cc5) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=aa0cf633-9051-4f62-82e5-7df7efc44cc5) ___________ ERROR at setup of TestRDefs.testQuantizationSettingsBulk ___________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 06:18:58,531 INFO [ omero.gateway] (MainThread) created connection (uuid=b4858ae8-b55c-4e1f-97e0-568c1be0e108) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:2243 created connection (uuid=b4858ae8-b55c-4e1f-97e0-568c1be0e108) _______________ ERROR at setup of TestRDefs.testGetChannelsNoRE ________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. 
_______________ ERROR at setup of TestRDefs.testGetChannelsNoRE ________________

http_conn_args = {'check_hostname': None, 'context': None}
host = 'downloads.openmicroscopy.org'

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib64/python3.9/urllib/request.py:1346:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.9/http/client.py:1285: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib64/python3.9/http/client.py:1331: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib64/python3.9/http/client.py:1280: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib64/python3.9/http/client.py:1040: in _send_output
    self.send(msg)
/usr/lib64/python3.9/http/client.py:980: in send
    self.connect()
/usr/lib64/python3.9/http/client.py:1447: in connect
    super().connect()
/usr/lib64/python3.9/http/client.py:946: in connect
    self.sock = self._create_connection(
/usr/lib64/python3.9/socket.py:844: in create_connection
    raise err
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('downloads.openmicroscopy.org', 443)

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None):
        ...
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               TimeoutError: [Errno 110] Connection timed out

/usr/lib64/python3.9/socket.py:832: TimeoutError

During handling of the above exception, another exception occurred:

    @pytest.fixture(scope='function')
    def author_testimg(request, gatewaywrapper):
        """
        logs in as Author and returns the test image, creating it first if needed.
        """
        gatewaywrapper.loginAsAuthor()
>       rv = gatewaywrapper.getTestImage(autocreate=True)

../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage
    return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset,
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage
    i = IMAGES[alias].create()
../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create
    fin = urllib.request.urlopen(TESTIMG_URL + self.filename)
    ... (same urllib.request call chain as above, ending in do_open)
E               urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>

/usr/lib64/python3.9/urllib/request.py:1349: URLError
---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:21:09,600 INFO  [ omero.gateway] (MainThread) created connection (uuid=4348c77a-4fdc-4c81-82c1-2e7cf63457f9)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=4348c77a-4fdc-4c81-82c1-2e7cf63457f9)
____________ ERROR at setup of TestRDefs.testSetActiveChannelsNoRE _____________

Setup of the author_testimg fixture failed with the same chained traceback:
urllib.error.URLError raised from TimeoutError: [Errno 110] Connection timed out
while downloading the test image from downloads.openmicroscopy.org
(/usr/lib64/python3.9/urllib/request.py:1349: URLError).

---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:23:20,679 INFO  [ omero.gateway] (MainThread) created connection (uuid=a9bee5fd-cd03-4298-a490-1635b671cf34)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=a9bee5fd-cd03-4298-a490-1635b671cf34)
___________ ERROR at setup of TestRDefs.testSetActiveChannelsWithRE ____________

Setup of the author_testimg fixture failed with the same chained traceback:
urllib.error.URLError raised from TimeoutError: [Errno 110] Connection timed out
(/usr/lib64/python3.9/urllib/request.py:1349: URLError).

---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:25:31,745 INFO  [ omero.gateway] (MainThread) created connection (uuid=d7720781-b929-4cd0-8deb-de566b071082)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=d7720781-b929-4cd0-8deb-de566b071082)
___ ERROR at setup of TestRDefs.test_set_active_channels_set_inactive[True] ____

Setup of the author_testimg fixture failed with the same chained traceback:
urllib.error.URLError raised from TimeoutError: [Errno 110] Connection timed out
(/usr/lib64/python3.9/urllib/request.py:1349: URLError).

---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:27:43,248 INFO  [ omero.gateway] (MainThread) created connection (uuid=9cc432b6-2123-4912-9456-65bfaf355b9d)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=9cc432b6-2123-4912-9456-65bfaf355b9d)
___ ERROR at setup of TestRDefs.test_set_active_channels_set_inactive[False] ___

Setup of the author_testimg fixture failed with the same chained traceback:
urllib.error.URLError raised from TimeoutError: [Errno 110] Connection timed out
(/usr/lib64/python3.9/urllib/request.py:1349: URLError).

---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:29:53,886 INFO  [ omero.gateway] (MainThread) created connection (uuid=b293ad4e-f852-44a8-8451-ed1c1791ad92)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=b293ad4e-f852-44a8-8451-ed1c1791ad92)
______________ ERROR at setup of TestRDefs.testUnregisterService _______________

Setup of the author_testimg fixture failed with the same chained traceback:
urllib.error.URLError raised from TimeoutError: [Errno 110] Connection timed out
(/usr/lib64/python3.9/urllib/request.py:1349: URLError).

---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:32:04,972 INFO  [ omero.gateway] (MainThread) created connection (uuid=ea907fa1-4985-4605-a33a-45e35712b383)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=ea907fa1-4985-4605-a33a-45e35712b383)
_______________ ERROR at setup of TestRDefs.testRegisterService ________________

Setup of the author_testimg fixture failed with the same chained traceback:
urllib.error.URLError raised from TimeoutError: [Errno 110] Connection timed out
(/usr/lib64/python3.9/urllib/request.py:1349: URLError).

---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:34:16,049 INFO  [ omero.gateway] (MainThread) created connection (uuid=ff799cf0-9bce-4553-83eb-7dc1520e1427)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=ff799cf0-9bce-4553-83eb-7dc1520e1427)
___________________ ERROR at setup of TestRDefs.testCloseRE ____________________

Setup of the author_testimg fixture failed with the same chained traceback:
urllib.error.URLError raised from TimeoutError: [Errno 110] Connection timed out
(/usr/lib64/python3.9/urllib/request.py:1349: URLError).

---------------------------- Captured stderr setup -----------------------------
2024-10-24 06:36:27,100 INFO  [ omero.gateway] (MainThread) created connection (uuid=1e34c257-dfce-44a6-8eb1-b4710dc3f7fc)
------------------------------ Captured log setup ------------------------------
INFO     omero.gateway:__init__.py:2243 created connection (uuid=1e34c257-dfce-44a6-8eb1-b4710dc3f7fc)
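Every TestRDefs setup above failed on the same unreachable host, and the TestWrapper error that follows repeats it once more. A session-level guard that probes the download host once and skips the download-dependent tests when it cannot be reached would turn an outage like this into a short list of skips instead of a wall of identical errors. A rough conftest.py sketch, assuming pytest and a hypothetical needs_download marker rather than the project's actual fixture wiring:

# conftest.py (hypothetical) -- skip tests that need the remote test image
# when downloads.openmicroscopy.org is unreachable from the build node.
import socket

import pytest

DOWNLOAD_HOST = ("downloads.openmicroscopy.org", 443)

def _host_reachable(address, timeout=5.0):
    try:
        with socket.create_connection(address, timeout=timeout):
            return True
    except OSError:
        return False

# Probed once at collection time.
_REACHABLE = _host_reachable(DOWNLOAD_HOST)

def pytest_collection_modifyitems(config, items):
    # Tests that rely on the downloaded image would opt in with
    # @pytest.mark.needs_download (marker name is an assumption and would
    # need to be registered in the pytest configuration).
    if _REACHABLE:
        return
    skip = pytest.mark.skip(reason="downloads.openmicroscopy.org unreachable")
    for item in items:
        if "needs_download" in item.keywords:
            item.add_marker(skip)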
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. """ gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 06:42:37,188 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:37,193 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:37,236 INFO [ omero.gateway] (MainThread) created connection (uuid=e72c8115-5c87-4316-bf8c-35d724d9a77a) 2024-10-24 06:42:37,717 INFO [ omero.gateway] (MainThread) created connection (uuid=97632c00-f82f-46f7-ad65-82756e2953e0) 2024-10-24 06:42:37,753 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:37,790 INFO [ omero.gateway] (MainThread) created connection (uuid=7f716f6d-a52c-46cc-be74-21103ca1bc4d) 2024-10-24 06:42:37,825 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:37,864 INFO [ omero.gateway] (MainThread) created connection (uuid=622a2656-c27e-4295-a217-d74cf877910c) 2024-10-24 06:42:37,905 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:37,944 INFO [ omero.gateway] (MainThread) created connection (uuid=a80e5a19-b4db-43a4-a147-b8d3e0ebea65) 2024-10-24 06:42:37,987 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:38,027 INFO [ omero.gateway] (MainThread) created connection (uuid=0d4079bf-4b34-4fd8-9eb0-043a4591c7bc) 2024-10-24 06:42:38,069 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:38,107 INFO [ omero.gateway] (MainThread) created connection (uuid=5c9dae8f-fe67-4e45-a3ea-898d23781d76) 2024-10-24 06:42:38,148 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:38,188 INFO [ omero.gateway] (MainThread) created connection (uuid=a2090aec-190e-44de-babc-bc6a0cc4ec6e) 2024-10-24 06:42:38,240 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:38,284 INFO [ omero.gateway] (MainThread) created connection (uuid=d6b47a35-fd96-4eb6-a9f0-8c18f021d709) 2024-10-24 06:42:38,339 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:38,379 INFO [ omero.gateway] (MainThread) created connection (uuid=d5a05f63-00ac-4f95-b6e9-3b5e9f22b3ce) 2024-10-24 06:42:38,430 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:38,433 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:42:38,472 INFO [ omero.gateway] (MainThread) created connection (uuid=d83be2b2-eebe-4967-9f29-80a73dbff0d9) 2024-10-24 06:42:38,551 INFO [ omero.gateway] (MainThread) created connection (uuid=67345447-5634-4537-afe2-98af81859b4f) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection 
(uuid=e72c8115-5c87-4316-bf8c-35d724d9a77a) INFO omero.gateway:__init__.py:2243 created connection (uuid=97632c00-f82f-46f7-ad65-82756e2953e0) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=7f716f6d-a52c-46cc-be74-21103ca1bc4d) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=622a2656-c27e-4295-a217-d74cf877910c) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=a80e5a19-b4db-43a4-a147-b8d3e0ebea65) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=0d4079bf-4b34-4fd8-9eb0-043a4591c7bc) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=5c9dae8f-fe67-4e45-a3ea-898d23781d76) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=a2090aec-190e-44de-babc-bc6a0cc4ec6e) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=d6b47a35-fd96-4eb6-a9f0-8c18f021d709) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=d5a05f63-00ac-4f95-b6e9-3b5e9f22b3ce) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=d83be2b2-eebe-4967-9f29-80a73dbff0d9) INFO omero.gateway:__init__.py:2243 created connection (uuid=67345447-5634-4537-afe2-98af81859b4f) _______________ ERROR at setup of TestWrapper.testDatasetWrapper _______________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 06:44:47,573 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:44:47,616 INFO [ omero.gateway] (MainThread) created connection (uuid=411e7536-cdb0-4765-b3b2-cad47a6d3cd0) 2024-10-24 06:44:47,696 INFO [ omero.gateway] (MainThread) created connection (uuid=bdc4ecbe-e5d6-449a-94b4-949540529338) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=411e7536-cdb0-4765-b3b2-cad47a6d3cd0) INFO omero.gateway:__init__.py:2243 created connection (uuid=bdc4ecbe-e5d6-449a-94b4-949540529338) _______________ ERROR at setup of TestWrapper.testDetailsWrapper _______________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. """ host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: request = > gatewaywrapper = @pytest.fixture(scope='function') def author_testimg(request, gatewaywrapper): """ logs in as Author and returns the test image, creating it first if needed. 
""" gatewaywrapper.loginAsAuthor() > rv = gatewaywrapper.getTestImage(autocreate=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/pytest_fixtures.py:81: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ---------------------------- Captured stderr setup ----------------------------- 2024-10-24 06:46:57,898 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:46:57,942 INFO [ omero.gateway] (MainThread) created connection (uuid=80fa94d8-352c-4dca-b550-74b642031e68) 2024-10-24 06:46:58,025 INFO [ omero.gateway] (MainThread) created connection (uuid=10629aa4-b040-41bf-9a0f-e2b670b1144e) I< ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=80fa94d8-352c-4dca-b550-74b642031e68) INFO omero.gateway:__init__.py:2243 created connection (uuid=10629aa4-b040-41bf-9a0f-e2b670b1144e) =================================== FAILURES =================================== _________________________ TestAdmin.test_checkupgrade0 _________________________ self = monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7f02617acfd0> def test_checkupgrade0(self, monkeypatch): monkeypatch.setattr(omero.plugins.prefs, "UpgradeCheck", createUpgradeCheckClass("999999999.0.0")) self.args.append("checkupgrade") > self.go() test/integration/clitest/test_admin.py:66: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/integration/clitest/test_admin.py:60: in go self.cli.invoke(self.args, strict=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/cli.py:1211: in invoke self.assertRC() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def assertRC(self): if self.rv != 0: > raise NonZeroReturnCode(self.rv, "assert failed") E omero.cli.NonZeroReturnCode: assert failed ../../../../.venv3/lib64/python3.9/site-packages/omero/cli.py:1200: NonZeroReturnCode ----------------------------- Captured stdout call ----------------------------- HTTPConnectionPool(host='upgrade.openmicroscopy.org.uk', port=80): Max retries exceeded with url: /?version=999999999.0.0&os.name=Linux&os.arch=x86_64&os.version=Linux-5.14.0-427.40.1.el9_4.x86_64-x86_64-with-glibc2.34&python.version=3.9.18&python.compiler=GCC+11.4.1+20231218+%28Red+Hat+11.4.1-3%29&python.build=main&python.build=Aug+23+2024+00%3A00%3A00 (Caused by ConnectTimeoutError(, 'Connection to upgrade.openmicroscopy.org.uk timed out. (connect timeout=6.0)')) ----------------------------- Captured stderr call ----------------------------- Error printing text ------------------------------ Captured log call ------------------------------- ERROR omero.util.UpgradeCheck:upgrade_check.py:141 HTTPConnectionPool(host='upgrade.openmicroscopy.org.uk', port=80): Max retries exceeded with url: /?version=999999999.0.0&os.name=Linux&os.arch=x86_64&os.version=Linux-5.14.0-427.40.1.el9_4.x86_64-x86_64-with-glibc2.34&python.version=3.9.18&python.compiler=GCC+11.4.1+20231218+%28Red+Hat+11.4.1-3%29&python.build=main&python.build=Aug+23+2024+00%3A00%3A00 (Caused by ConnectTimeoutError(, 'Connection to upgrade.openmicroscopy.org.uk timed out. 
(connect timeout=6.0)')) _________________________ TestAdmin.test_checkupgrade1 _________________________ self = monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7f0259ed1670> def test_checkupgrade1(self, monkeypatch): monkeypatch.setattr(omero.plugins.prefs, "UpgradeCheck", createUpgradeCheckClass("0.0.0")) self.args.append("checkupgrade") with pytest.raises(NonZeroReturnCode) as exc: self.go() > assert exc.value.rv == 1 E AssertionError: assert 2 == 1 E + where 2 = NonZeroReturnCode('assert failed').rv E + where NonZeroReturnCode('assert failed') = .value test/integration/clitest/test_admin.py:74: AssertionError ----------------------------- Captured stdout call ----------------------------- HTTPConnectionPool(host='upgrade.openmicroscopy.org.uk', port=80): Max retries exceeded with url: /?version=0.0.0&os.name=Linux&os.arch=x86_64&os.version=Linux-5.14.0-427.40.1.el9_4.x86_64-x86_64-with-glibc2.34&python.version=3.9.18&python.compiler=GCC+11.4.1+20231218+%28Red+Hat+11.4.1-3%29&python.build=main&python.build=Aug+23+2024+00%3A00%3A00 (Caused by ConnectTimeoutError(, 'Connection to upgrade.openmicroscopy.org.uk timed out. (connect timeout=6.0)')) ----------------------------- Captured stderr call ----------------------------- Error printing text ------------------------------ Captured log call ------------------------------- ERROR omero.util.UpgradeCheck:upgrade_check.py:141 HTTPConnectionPool(host='upgrade.openmicroscopy.org.uk', port=80): Max retries exceeded with url: /?version=0.0.0&os.name=Linux&os.arch=x86_64&os.version=Linux-5.14.0-427.40.1.el9_4.x86_64-x86_64-with-glibc2.34&python.version=3.9.18&python.compiler=GCC+11.4.1+20231218+%28Red+Hat+11.4.1-3%29&python.build=main&python.build=Aug+23+2024+00%3A00%3A00 (Caused by ConnectTimeoutError(, 'Connection to upgrade.openmicroscopy.org.uk timed out. (connect timeout=6.0)')) _________________ TestAdminRestrictedAdmin.test_checkupgrade0 __________________ self = monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7f026641eb20> def test_checkupgrade0(self, monkeypatch): monkeypatch.setattr(omero.plugins.prefs, "UpgradeCheck", createUpgradeCheckClass("999999999.0.0")) self.args.append("checkupgrade") > self.cli.invoke(self.args, strict=True) test/integration/clitest/test_admin.py:134: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/cli.py:1211: in invoke self.assertRC() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def assertRC(self): if self.rv != 0: > raise NonZeroReturnCode(self.rv, "assert failed") E omero.cli.NonZeroReturnCode: assert failed ../../../../.venv3/lib64/python3.9/site-packages/omero/cli.py:1200: NonZeroReturnCode ----------------------------- Captured stdout call ----------------------------- HTTPConnectionPool(host='upgrade.openmicroscopy.org.uk', port=80): Max retries exceeded with url: /?version=999999999.0.0&os.name=Linux&os.arch=x86_64&os.version=Linux-5.14.0-427.40.1.el9_4.x86_64-x86_64-with-glibc2.34&python.version=3.9.18&python.compiler=GCC+11.4.1+20231218+%28Red+Hat+11.4.1-3%29&python.build=main&python.build=Aug+23+2024+00%3A00%3A00 (Caused by ConnectTimeoutError(, 'Connection to upgrade.openmicroscopy.org.uk timed out. 
(connect timeout=6.0)')) ----------------------------- Captured stderr call ----------------------------- Error printing text ------------------------------ Captured log call ------------------------------- ERROR omero.util.UpgradeCheck:upgrade_check.py:141 HTTPConnectionPool(host='upgrade.openmicroscopy.org.uk', port=80): Max retries exceeded with url: /?version=999999999.0.0&os.name=Linux&os.arch=x86_64&os.version=Linux-5.14.0-427.40.1.el9_4.x86_64-x86_64-with-glibc2.34&python.version=3.9.18&python.compiler=GCC+11.4.1+20231218+%28Red+Hat+11.4.1-3%29&python.build=main&python.build=Aug+23+2024+00%3A00%3A00 (Caused by ConnectTimeoutError(, 'Connection to upgrade.openmicroscopy.org.uk timed out. (connect timeout=6.0)')) _________________ TestAdminRestrictedAdmin.test_checkupgrade1 __________________ self = monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7f02664171c0> def test_checkupgrade1(self, monkeypatch): monkeypatch.setattr(omero.plugins.prefs, "UpgradeCheck", createUpgradeCheckClass("0.0.0")) self.args.append("checkupgrade") with pytest.raises(NonZeroReturnCode) as exc: self.cli.invoke(self.args, strict=True) > assert exc.value.rv == 1 E AssertionError: assert 2 == 1 E + where 2 = NonZeroReturnCode('assert failed').rv E + where NonZeroReturnCode('assert failed') = .value test/integration/clitest/test_admin.py:142: AssertionError ----------------------------- Captured stdout call ----------------------------- HTTPConnectionPool(host='upgrade.openmicroscopy.org.uk', port=80): Max retries exceeded with url: /?version=0.0.0&os.name=Linux&os.arch=x86_64&os.version=Linux-5.14.0-427.40.1.el9_4.x86_64-x86_64-with-glibc2.34&python.version=3.9.18&python.compiler=GCC+11.4.1+20231218+%28Red+Hat+11.4.1-3%29&python.build=main&python.build=Aug+23+2024+00%3A00%3A00 (Caused by ConnectTimeoutError(, 'Connection to upgrade.openmicroscopy.org.uk timed out. (connect timeout=6.0)')) ----------------------------- Captured stderr call ----------------------------- Error printing text ------------------------------ Captured log call ------------------------------- ERROR omero.util.UpgradeCheck:upgrade_check.py:141 HTTPConnectionPool(host='upgrade.openmicroscopy.org.uk', port=80): Max retries exceeded with url: /?version=0.0.0&os.name=Linux&os.arch=x86_64&os.version=Linux-5.14.0-427.40.1.el9_4.x86_64-x86_64-with-glibc2.34&python.version=3.9.18&python.compiler=GCC+11.4.1+20231218+%28Red+Hat+11.4.1-3%29&python.build=main&python.build=Aug+23+2024+00%3A00%3A00 (Caused by ConnectTimeoutError(, 'Connection to upgrade.openmicroscopy.org.uk timed out. 
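
[Editor's note] The checkupgrade failures above are not assertion-logic problems: the CLI exits with rv 2 instead of the expected 0 or 1 because UpgradeCheck cannot reach upgrade.openmicroscopy.org.uk:80 within its 6.0 s connect timeout. A stdlib-only reachability probe like the sketch below (illustration, not part of the build) can confirm whether this node has outbound access before the tests are blamed:

    # Illustrative sketch only: quick TCP reachability probe for the two hosts
    # this run fails to contact.
    import socket

    def can_reach(host, port, timeout=6.0):
        # Returns True if a TCP connection succeeds within `timeout` seconds;
        # the 6.0 s default mirrors the connect timeout reported in the log.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:  # includes TimeoutError [Errno 110] Connection timed out
            return False

    print(can_reach("upgrade.openmicroscopy.org.uk", 80))
    print(can_reach("downloads.openmicroscopy.org", 443))
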
(connect timeout=6.0)')) _____________ TestGetObject.testGetObjectsByMapAnnotations[Image] ______________ self = datatype = 'Image' @pytest.mark.parametrize("datatype", ['Image', 'Dataset', 'Project', 'Screen', 'Plate']) def testGetObjectsByMapAnnotations(self, datatype): > client, exp = self.new_client_and_user() test/integration/gatewaytest/test_get_objects.py:767: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:604: in new_client_and_user user = self.new_user(group, owner=owner, system=system, perms=perms, ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:526: in new_user admin_service = cls.root.getSession().getAdminService() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = session-a7a64e87-1558-47a8-b2c2-820dfc5775a0/6c99c8c6-60ac-4368-bc0a-6d4a4ada2232 -t -e 1.1 @ BlitzAdapters _ctx = None def getAdminService(self, _ctx=None): > return _M_omero.api.ServiceFactory._op_getAdminService.invoke(self, ((), _ctx)) E Ice.ConnectionLostException: Ice.ConnectionLostException: E recv() returned zero ../../../../.venv3/lib64/python3.9/site-packages/omero_API_ice.py:704: ConnectionLostException ____________ TestGetObject.testGetObjectsByMapAnnotations[Dataset] _____________ self = datatype = 'Dataset' @pytest.mark.parametrize("datatype", ['Image', 'Dataset', 'Project', 'Screen', 'Plate']) def testGetObjectsByMapAnnotations(self, datatype): > client, exp = self.new_client_and_user() test/integration/gatewaytest/test_get_objects.py:767: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:604: in new_client_and_user user = self.new_user(group, owner=owner, system=system, perms=perms, ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:526: in new_user admin_service = cls.root.getSession().getAdminService() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = session-a7a64e87-1558-47a8-b2c2-820dfc5775a0/6c99c8c6-60ac-4368-bc0a-6d4a4ada2232 -t -e 1.1 @ BlitzAdapters _ctx = None def getAdminService(self, _ctx=None): > return _M_omero.api.ServiceFactory._op_getAdminService.invoke(self, ((), _ctx)) E Ice.ConnectionLostException: Ice.ConnectionLostException: E recv() returned zero ../../../../.venv3/lib64/python3.9/site-packages/omero_API_ice.py:704: ConnectionLostException ____________ TestGetObject.testGetObjectsByMapAnnotations[Project] _____________ self = datatype = 'Project' @pytest.mark.parametrize("datatype", ['Image', 'Dataset', 'Project', 'Screen', 'Plate']) def testGetObjectsByMapAnnotations(self, datatype): > client, exp = self.new_client_and_user() test/integration/gatewaytest/test_get_objects.py:767: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:604: in new_client_and_user user = self.new_user(group, owner=owner, system=system, perms=perms, ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:526: in new_user admin_service = cls.root.getSession().getAdminService() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = session-a7a64e87-1558-47a8-b2c2-820dfc5775a0/6c99c8c6-60ac-4368-bc0a-6d4a4ada2232 -t -e 1.1 @ BlitzAdapters _ctx = None def getAdminService(self, _ctx=None): > return 
_M_omero.api.ServiceFactory._op_getAdminService.invoke(self, ((), _ctx)) E Ice.ConnectionLostException: Ice.ConnectionLostException: E recv() returned zero ../../../../.venv3/lib64/python3.9/site-packages/omero_API_ice.py:704: ConnectionLostException _____________ TestGetObject.testGetObjectsByMapAnnotations[Screen] _____________ self = datatype = 'Screen' @pytest.mark.parametrize("datatype", ['Image', 'Dataset', 'Project', 'Screen', 'Plate']) def testGetObjectsByMapAnnotations(self, datatype): > client, exp = self.new_client_and_user() test/integration/gatewaytest/test_get_objects.py:767: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:604: in new_client_and_user user = self.new_user(group, owner=owner, system=system, perms=perms, ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:526: in new_user admin_service = cls.root.getSession().getAdminService() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = session-a7a64e87-1558-47a8-b2c2-820dfc5775a0/6c99c8c6-60ac-4368-bc0a-6d4a4ada2232 -t -e 1.1 @ BlitzAdapters _ctx = None def getAdminService(self, _ctx=None): > return _M_omero.api.ServiceFactory._op_getAdminService.invoke(self, ((), _ctx)) E Ice.ConnectionLostException: Ice.ConnectionLostException: E recv() returned zero ../../../../.venv3/lib64/python3.9/site-packages/omero_API_ice.py:704: ConnectionLostException _____________ TestGetObject.testGetObjectsByMapAnnotations[Plate] ______________ self = datatype = 'Plate' @pytest.mark.parametrize("datatype", ['Image', 'Dataset', 'Project', 'Screen', 'Plate']) def testGetObjectsByMapAnnotations(self, datatype): > client, exp = self.new_client_and_user() test/integration/gatewaytest/test_get_objects.py:767: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:604: in new_client_and_user user = self.new_user(group, owner=owner, system=system, perms=perms, ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:526: in new_user admin_service = cls.root.getSession().getAdminService() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = session-a7a64e87-1558-47a8-b2c2-820dfc5775a0/6c99c8c6-60ac-4368-bc0a-6d4a4ada2232 -t -e 1.1 @ BlitzAdapters _ctx = None def getAdminService(self, _ctx=None): > return _M_omero.api.ServiceFactory._op_getAdminService.invoke(self, ((), _ctx)) E Ice.ConnectionLostException: Ice.ConnectionLostException: E recv() returned zero ../../../../.venv3/lib64/python3.9/site-packages/omero_API_ice.py:704: ConnectionLostException --------------------------- Captured stderr teardown --------------------------- 2024-10-24 04:50:57,745 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) *2024-10-24 04:50:57,791 INFO [ omero.gateway] (MainThread) created connection (uuid=82a534de-59a2-4157-985f-2e8e5832cc19) *2024-10-24 04:50:57,950 INFO [ omero.gateway] (MainThread) created connection (uuid=56cf2dfc-82bb-4d97-ab5f-87171446617e) *2024-10-24 04:50:58,080 INFO [ omero.gateway] (MainThread) created connection (uuid=32feaf89-b941-4243-8459-e5e13179b62d) *2024-10-24 04:50:58,161 INFO [ omero.gateway] (MainThread) created connection (uuid=b09ce4aa-4eee-481e-ae85-d9247e6f451a) *2024-10-24 04:50:58,248 INFO [ omero.gateway] (MainThread) created connection (uuid=d7752107-4369-4722-8ac3-932918687fa5) *2024-10-24 
04:50:58,335 INFO [ omero.gateway] (MainThread) created connection (uuid=1a6fb1a2-a92f-474c-916e-cc524adbfc32) 2024-10-24 04:50:58,378 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:50:58,384 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:50:58,422 INFO [ omero.gateway] (MainThread) created connection (uuid=0b0f5472-390e-4634-b0ed-c6e21dbb70af) 2024-10-24 04:50:58,672 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 04:50:58,676 WARNI [ omero.client] (MainThread) Cannot get session service for killSession. Using closeSession ---------------------------- Captured log teardown ----------------------------- INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=82a534de-59a2-4157-985f-2e8e5832cc19) INFO omero.gateway:__init__.py:2243 created connection (uuid=56cf2dfc-82bb-4d97-ab5f-87171446617e) INFO omero.gateway:__init__.py:2243 created connection (uuid=32feaf89-b941-4243-8459-e5e13179b62d) INFO omero.gateway:__init__.py:2243 created connection (uuid=b09ce4aa-4eee-481e-ae85-d9247e6f451a) INFO omero.gateway:__init__.py:2243 created connection (uuid=d7752107-4369-4722-8ac3-932918687fa5) INFO omero.gateway:__init__.py:2243 created connection (uuid=1a6fb1a2-a92f-474c-916e-cc524adbfc32) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=0b0f5472-390e-4634-b0ed-c6e21dbb70af) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) WARNING omero.client:clients.py:1095 Cannot get session service for killSession. Using closeSession _____________________________ TestUser.testSaveAs ______________________________ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib64/python3.9/urllib/request.py:1346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib64/python3.9/http/client.py:1285: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib64/python3.9/http/client.py:1331: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1280: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib64/python3.9/http/client.py:1040: in _send_output self.send(msg) /usr/lib64/python3.9/http/client.py:980: in send self.connect() /usr/lib64/python3.9/http/client.py:1447: in connect super().connect() /usr/lib64/python3.9/http/client.py:946: in connect self.sock = self._create_connection( /usr/lib64/python3.9/socket.py:844: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('downloads.openmicroscopy.org', 443) timeout = , source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
""" host, port = address err = None for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E TimeoutError: [Errno 110] Connection timed out /usr/lib64/python3.9/socket.py:832: TimeoutError During handling of the above exception, another exception occurred: self = gatewaywrapper = def testSaveAs(self, gatewaywrapper): for u in (gatewaywrapper.AUTHOR, gatewaywrapper.ADMIN): # Test image should be owned by author gatewaywrapper.loginAsAuthor() > image = gatewaywrapper.getTestImage(autocreate=True) test/integration/gatewaytest/test_user.py:34: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/testdb_create.py:150: in getTestImage return dbhelpers.getImage(self.gateway, 'testimg1', forceds=dataset, ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:501: in getImage i = IMAGES[alias].create() ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/scripts/dbhelpers.py:419: in create fin = urllib.request.urlopen(TESTIMG_URL + self.filename) /usr/lib64/python3.9/urllib/request.py:214: in urlopen return opener.open(url, data, timeout) /usr/lib64/python3.9/urllib/request.py:517: in open response = self._open(req, data) /usr/lib64/python3.9/urllib/request.py:534: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib64/python3.9/urllib/request.py:494: in _call_chain result = func(*args) /usr/lib64/python3.9/urllib/request.py:1389: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'check_hostname': None, 'context': None} host = 'downloads.openmicroscopy.org' h = def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib64/python3.9/urllib/request.py:1349: URLError ----------------------------- Captured stderr call ----------------------------- 2024-10-24 06:40:14,382 INFO [ omero.gateway] (MainThread) closed connection (uuid=None) 2024-10-24 06:40:14,420 INFO [ omero.gateway] (MainThread) created connection (uuid=c60e5076-573b-4fe0-817b-67779e52e9c1) 2024-10-24 06:40:14,499 INFO [ omero.gateway] (MainThread) created connection (uuid=c9dedd32-4fa4-459d-8f8b-0082b7d0fd38) I< ------------------------------ Captured log call ------------------------------- INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=c60e5076-573b-4fe0-817b-67779e52e9c1) INFO omero.gateway:__init__.py:2243 created connection (uuid=c9dedd32-4fa4-459d-8f8b-0082b7d0fd38) __________________________ TestUpgradeCheck.testReal ___________________________ self = def testReal(self): uc = UpgradeCheck("test", version="test") uc.run() > assert uc.isUpgradeNeeded() is True E assert False is True E + where False = isUpgradeNeeded() E + where isUpgradeNeeded = .isUpgradeNeeded test/integration/test_util.py:35: AssertionError ----------------------------- Captured stderr call ----------------------------- 2024-10-24 07:27:51,555 ERROR [ omero.util.UpgradeCheck] (MainThread) HTTPConnectionPool(host='upgrade.openmicroscopy.org.uk', port=80): Max retries exceeded with url: /?version=test&os.name=Linux&os.arch=x86_64&os.version=Linux-5.14.0-427.40.1.el9_4.x86_64-x86_64-with-glibc2.34&python.version=3.9.18&python.compiler=GCC+11.4.1+20231218+%28Red+Hat+11.4.1-3%29&python.build=main&python.build=Aug+23+2024+00%3A00%3A00 (Caused by ConnectTimeoutError(, 'Connection to upgrade.openmicroscopy.org.uk timed out. (connect timeout=6.0)')) ------------------------------ Captured log call ------------------------------- ERROR omero.util.UpgradeCheck:upgrade_check.py:141 HTTPConnectionPool(host='upgrade.openmicroscopy.org.uk', port=80): Max retries exceeded with url: /?version=test&os.name=Linux&os.arch=x86_64&os.version=Linux-5.14.0-427.40.1.el9_4.x86_64-x86_64-with-glibc2.34&python.version=3.9.18&python.compiler=GCC+11.4.1+20231218+%28Red+Hat+11.4.1-3%29&python.build=main&python.build=Aug+23+2024+00%3A00%3A00 (Caused by ConnectTimeoutError(, 'Connection to upgrade.openmicroscopy.org.uk timed out. (connect timeout=6.0)')) =============================== warnings summary =============================== ../../../../.venv3/lib64/python3.9/site-packages/Ice.py:14 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/Ice.py:14: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses import sys, string, imp, os, threading, warnings, datetime ../../../../.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:241 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:241: RemovedInDjango50Warning: The default value of USE_TZ will change from False to True in Django 5.0. Set USE_TZ to False in your project settings if you want to keep the current default behavior. 
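
[Editor's note] TestUpgradeCheck.testReal above fails for the same reason: UpgradeCheck.run() logs the connection error and isUpgradeNeeded() then returns False. The sketch below is a hedged, network-tolerant variant (not the project's test); the import path omero.util.upgrade_check is inferred from the upgrade_check.py filename in the log and should be treated as an assumption:

    # Sketch only (assumes omero.util.upgrade_check exposes the UpgradeCheck
    # class exercised by test_util.py above); skips instead of failing when
    # the upgrade registry cannot be reached.
    import pytest
    from omero.util.upgrade_check import UpgradeCheck  # assumed module path

    def test_real_guarded():
        uc = UpgradeCheck("test", version="test")
        uc.run()
        if not uc.isUpgradeNeeded():
            # False here can mean "registry unreachable" (this run) as well as
            # "no upgrade advertised", so skip rather than assert.
            pytest.skip("upgrade registry unreachable or no upgrade advertised")
        assert uc.isUpgradeNeeded() is True
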
warnings.warn( ../../../../.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:289 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:289: RemovedInDjango51Warning: The STATICFILES_STORAGE setting is deprecated. Use STORAGES instead. warnings.warn(STATICFILES_STORAGE_DEPRECATED_MSG, RemovedInDjango51Warning) ../../../../.venv3/lib64/python3.9/site-packages/pipeline/__init__.py:1 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/pipeline/__init__.py:1: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html from pkg_resources import DistributionNotFound, get_distribution OmeroPy/test/integration/clitest/test_admin.py: 2 warnings OmeroPy/test/integration/clitest/test_chgrp.py: 116 warnings OmeroPy/test/integration/clitest/test_chown.py: 76 warnings OmeroPy/test/integration/clitest/test_cleanse.py: 7 warnings OmeroPy/test/integration/clitest/test_delete.py: 81 warnings OmeroPy/test/integration/clitest/test_download.py: 91 warnings OmeroPy/test/integration/clitest/test_duplicate.py: 18 warnings OmeroPy/test/integration/clitest/test_fs.py: 26 warnings OmeroPy/test/integration/clitest/test_group.py: 154 warnings OmeroPy/test/integration/clitest/test_import.py: 189 warnings OmeroPy/test/integration/clitest/test_import_bulk.py: 1 warning OmeroPy/test/integration/clitest/test_ldap.py: 5 warnings OmeroPy/test/integration/clitest/test_metadata.py: 10 warnings OmeroPy/test/integration/clitest/test_obj.py: 56 warnings OmeroPy/test/integration/clitest/test_pyramids.py: 10 warnings OmeroPy/test/integration/clitest/test_script.py: 10 warnings OmeroPy/test/integration/clitest/test_search.py: 16 warnings OmeroPy/test/integration/clitest/test_sessions.py: 121 warnings OmeroPy/test/integration/clitest/test_tag.py: 57 warnings OmeroPy/test/integration/clitest/test_upload.py: 6 warnings OmeroPy/test/integration/clitest/test_user.py: 183 warnings /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/plugins/sessions.py:176: DeprecationWarning: OMERO_SESSION_DIR is deprecated. Use OMERO_SESSIONDIR instead. warnings.warn( OmeroPy/test/integration/clitest/test_delete.py::TestDelete::testInputWithElisionDefault[1] OmeroPy/test/integration/clitest/test_delete.py::TestDelete::testInputWithElisionDefault[2] OmeroPy/test/integration/clitest/test_delete.py::TestDelete::testInputWithElisionDefault[3] /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/cli.py:2105: DeprecationWarning: Using '--dry-run'. Future versions will switch to '--force'. 
Explicitly set the parameter for portability warnings.warn("\nUsing '--dry-run'.\ OmeroPy/test/integration/clitest/test_import_bulk.py::TestImportBulk::testBulk /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_import_bulk.py:131: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testBulk1') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_obj.py::TestObj::test_get_unit_and_value /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_model_PixelsI.py:37: DeprecationWarning: Pixels.relatedTo is deprecated warnings.warn(item + " is deprecated", DeprecationWarning) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLogin[None-True] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLogin_None_True_0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLogin[None-False] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLogin_None_False_0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLogin[300-True] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLogin_300_True_0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLogin[300-False] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLogin_300_False_0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLoginAs[rw----] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLoginAs_rw_____0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLoginAs[rwr---] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLoginAs_rwr____0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) 
OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLoginAs[rwra--] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLoginAs_rwra___0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLoginAs[rwrw--] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLoginAs_rwrw___0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLoginMultiGroup[True-True] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLoginMultiGroup_True_True_0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLoginMultiGroup[True-False] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLoginMultiGroup_True_False0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLoginMultiGroup[False-True] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLoginMultiGroup_False_True0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testLoginMultiGroup[False-False] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testLoginMultiGroup_False_Fals0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testGroup /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testGroup0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testTimeout /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got 
local('/tmp/pytest-of-omero/pytest-18/testTimeout0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testFile /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testFile0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testKey /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testKey0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testWho[user] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testWho_user_0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::testWho[root] /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/testWho_root_0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::test_open /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/test_open0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::test_open_with_id /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/test_open_with_id0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::test_open_restricted_admin_no_sudo /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/test_open_restricted_admin_no_0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::test_open_restricted_admin_sudo /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got 
local('/tmp/pytest-of-omero/pytest-18/test_open_restricted_admin_sud0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/clitest/test_sessions.py::TestSessions::test_close /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/clitest/test_sessions.py:36: PytestWarning: Value of environment variable OMERO_SESSIONDIR type should be str, but got local('/tmp/pytest-of-omero/pytest-18/test_close0') (type: LocalPath); converted to str implicitly monkeypatch.setenv("OMERO_SESSIONDIR", tmpdir) OmeroPy/test/integration/fstest/test_rename.py::TestRename::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/fstest/test_rename.py::TestRename::test_dir returned '9ac4e97b-3b23-4d9c-b3fb-68e3456e0717_3717/631d7c9a-1b75-42ec-a1f1-ac65290924e0', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/gatewaytest/test_annotation.py: 4 warnings OmeroPy/test/integration/gatewaytest/test_chmod.py: 28 warnings OmeroPy/test/integration/gatewaytest/test_connection.py: 10 warnings OmeroPy/test/integration/gatewaytest/test_get_objects.py: 3 warnings OmeroPy/test/integration/gatewaytest/test_image_wrapper.py: 3 warnings OmeroPy/test/integration/gatewaytest/test_user.py: 1 warning /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py:4810: DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead logger.warn("%s on %s to <%s> %s(%r, %r)", OmeroPy/test/integration/gatewaytest/test_annotation.py: 13 warnings OmeroPy/test/integration/gatewaytest/test_chgrp.py: 4 warnings OmeroPy/test/integration/gatewaytest/test_chmod.py: 3 warnings OmeroPy/test/integration/gatewaytest/test_get_objects.py: 25 warnings OmeroPy/test/integration/gatewaytest/test_missing_pyramid.py: 5 warnings OmeroPy/test/integration/gatewaytest/test_services.py: 2 warnings /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py:3824: DeprecationWarning: tostring() is deprecated. Use tobytes() instead. convertedPlane = byteSwappedPlane.tostring() OmeroPy/test/integration/gatewaytest/test_delete.py::TestDelete::testDeleteObjectDirect /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py:4381: DeprecationWarning: Deprecated. Use deleteObject() or deleteObjects() warnings.warn( OmeroPy/test/integration/gatewaytest/test_fs.py::TestFileset::testGetArchivedFiles OmeroPy/test/integration/gatewaytest/test_fs.py::TestFileset::testGetArchivedFiles OmeroPy/test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testGetArchivedFiles OmeroPy/test/integration/gatewaytest/test_fs.py::TestArchivedOriginalFiles::testGetArchivedFiles /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py:10236: DeprecationWarning: Deprecated. 
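The testDeleteObjectDirect warning above recommends the non-deprecated BlitzGateway delete calls. A rough sketch of the deleteObjects() form, assuming an existing gateway connection and image id; the keyword arguments shown are optional and this is not a drop-in rewrite of the test:

    from omero.gateway import BlitzGateway

    def delete_one_image(conn: BlitzGateway, image_id: int) -> None:
        # deleteObjects() takes a graph spec ("Image", "Dataset", ...) and a list of ids,
        # and can optionally delete annotations/children and wait for the server-side job.
        conn.deleteObjects("Image", [image_id], deleteAnns=True, deleteChildren=True, wait=True)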
Use getImportedImageFiles() warnings.warn( OmeroPy/test/integration/gatewaytest/test_get_objects.py::TestLeaderAndMemberOfGroup::testGroupSummaryAsOwnerDeprecated OmeroPy/test/integration/gatewaytest/test_get_objects.py::TestLeaderAndMemberOfGroup::testGroupSummaryAsOwnerDeprecated OmeroPy/test/integration/gatewaytest/test_get_objects.py::TestLeaderAndMemberOfGroup::testGroupSummaryAsMemberDeprecated OmeroPy/test/integration/gatewaytest/test_get_objects.py::TestLeaderAndMemberOfGroup::testGroupSummaryAsMemberDeprecated /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py:3048: DeprecationWarning: Deprecated. Use ExperimenterGroupWrapper.groupSummary() warnings.warn( OmeroPy/test/integration/gatewaytest/test_ticket10618.py: 24 warnings OmeroPy/test/integration/scriptstest/test_roi_handling_utils.py: 1 warning OmeroPy/test/integration/scriptstest/test_script_utils.py: 150 warnings OmeroPy/test/integration/test_exporter.py: 1 warning OmeroPy/test/integration/test_ishare.py: 3 warnings OmeroPy/test/integration/test_librarytest.py: 31 warnings OmeroPy/test/integration/test_permissions.py: 2 warnings OmeroPy/test/integration/test_render.py: 1 warning OmeroPy/test/integration/test_rois.py: 2 warnings OmeroPy/test/integration/test_scripts.py: 1 warning OmeroPy/test/integration/test_thumbnailPerms.py: 55 warnings OmeroPy/test/integration/test_tickets1000.py: 1 warning /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/util/script_utils.py:1093: DeprecationWarning: tostring() is deprecated. Use tobytes() instead. converted_plane = byte_swapped_plane.tostring() OmeroPy/test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[None-Screen2Plates] OmeroPy/test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[1-Screen2Plates] OmeroPy/test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[10-Screen2Plates] /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/util/populate_metadata.py:905: DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead log.warn("PlateColumn is unimplemented") OmeroPy/test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[None-Dataset2Images1Missing] OmeroPy/test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[1-Dataset2Images1Missing] OmeroPy/test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[10-Dataset2Images1Missing] /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/util/populate_metadata.py:955: DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead log.warn( OmeroPy/test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[None-Dataset2Images1Missing] OmeroPy/test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[1-Dataset2Images1Missing] OmeroPy/test/integration/metadata/test_populate.py::TestPopulateMetadata::testPopulateMetadata[10-Dataset2Images1Missing] /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/util/populate_metadata.py:1389: DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead log.warn("Invalid Id:%d found in row %s", row[n], row) OmeroPy/test/integration/scriptsharness/test_harness.py::TestScriptsViaOmeroCli::testDefinition 
OmeroPy/test/integration/scriptsharness/test_harness.py::TestScriptsViaOmeroCli::testDefinition OmeroPy/test/integration/scriptsharness/test_harness.py::TestScriptsViaOmeroCli::testSimpleScript OmeroPy/test/integration/scriptsharness/test_harness.py::TestScriptsViaOmeroCli::testSimpleScript :77: DeprecationWarning: This plugin is deprecated as of OMERO 5.5.0. Use the upload CLI plugin available from https://pypi.org/project/omero-upload/ instead. OmeroPy/test/integration/scriptstest/test_coverage.py::TestCoverage::testUploadAndScript /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/scriptstest/test_coverage.py::TestCoverage::testUploadAndScript returned 4958, which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_numpy_save_as_image[True-tiff] /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/unraisableexception.py:85: PytestUnraisableExceptionWarning: Exception ignored in: Traceback (most recent call last): File "/usr/lib64/python3.9/tempfile.py", line 461, in __del__ self.close() File "/usr/lib64/python3.9/tempfile.py", line 457, in close unlink(self.name) FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmp5b940e8l.tiff' warnings.warn(pytest.PytestUnraisableExceptionWarning(msg)) OmeroPy/test/integration/scriptstest/test_script_utils.py::TestScriptUtils::test_numpy_save_as_image[True-foo] /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/unraisableexception.py:85: PytestUnraisableExceptionWarning: Exception ignored in: Traceback (most recent call last): File "/usr/lib64/python3.9/tempfile.py", line 461, in __del__ self.close() File "/usr/lib64/python3.9/tempfile.py", line 457, in close unlink(self.name) FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmp15yacxcm.foo' warnings.warn(pytest.PytestUnraisableExceptionWarning(msg)) OmeroPy/test/integration/scriptstest/test_script_utils.py: 200 warnings /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/util/script_utils.py:1131: DeprecationWarning: tostring() is deprecated. Use tobytes() instead. converted_row = row.tostring() OmeroPy/test/integration/tablestest/test_backwards_compatibility.py: 12 warnings /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/tablestest/test_backwards_compatibility.py:84: DeprecationWarning: The binary mode of fromstring is deprecated, as it behaves surprisingly on unicode inputs. 
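Several warnings in this run flag ndarray.tostring(), which is only a deprecated alias of tobytes() and has been removed in newer NumPy releases. A small self-contained sketch of the replacement, using a made-up plane array rather than the actual OMERO pixel data:

    import numpy as np

    plane = np.arange(12, dtype=np.uint16).reshape(3, 4)
    byte_swapped_plane = plane.byteswap()
    # tobytes() is the supported spelling of the deprecated tostring(); the bytes are identical
    converted_plane = byte_swapped_plane.tobytes()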
Use frombuffer instead return numpy.fromstring(x, count=len(x), dtype=tables.UInt8Atom()) OmeroPy/test/integration/tablestest/test_service.py::TestTables::testBlankTable /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/tablestest/test_service.py::TestTables::testBlankTable returned object #0 (::omero::model::OriginalFile) { _id = object #1 (::omero::RLong) { _val = 5014 } _details = object #2 (::omero::model::Details) { _owner = object #3 (::omero::model::Experimenter) { _id = object #4 (::omero::RLong) { _val = 3847 } _details = object #5 (::omero::model::Details) { _owner = _group = _creationEvent = _updateEvent = _permissions = object #6 (::omero::model::Permissions) { _restrictions = { [0] = False [1] = True [2] = True [3] = True [4] = True [5] = True } _extendedRestrictions = { } _perm1 = -120 } _externalInfo = _call = { key = omero.client.uuid value = 3cc57eaa-dc1d-4608-a74e-47ea4e5aae8f key = omero.session.uuid value = 294224bd-db09-411a-93f2-3c3793f23815 } _event = object #7 (::omero::sys::EventContext) { shareId = -1 sessionId = 8772 sessionUuid = ********* userId = 3847 userName = ab798f25-d3b0-4156-886e-fdbee11b3dc2 sudoerId = sudoerName = groupId = 2824 groupName = aaea2363-bca0-4bd0-8f0d-8ec943340202 isAdmin = False adminPrivileges = { } eventId = 80630 eventType = User memberOfGroups = { [0] = 2824 [1] = 1 } leaderOfGroups = { } groupPermissions = object #8 (::omero::model::Permissions) { _restrictions = { } _extendedRestrictions = { } _perm1 = -120 } } } _loaded = True _version = object #9 (::omero::RInt) { _val = 0 } _groupExperimenterMapSeq = { } _groupExperimenterMapLoaded = False _omeName = object #10 (::omero::RString) { _val = ab798f25-d3b0-4156-886e-fdbee11b3dc2 } _firstName = object #11 (::omero::RString) { _val = ab798f25-d3b0-4156-886e-fdbee11b3dc2 } _middleName = _lastName = object #12 (::omero::RString) { _val = ab798f25-d3b0-4156-886e-fdbee11b3dc2 } _institution = _ldap = object #13 (::omero::RBool) { _val = False } _email = object #14 (::omero::RString) { _val = } _config = { } _annotationLinksSeq = { } _annotationLinksLoaded = False _annotationLinksCountPerOwner = { } } _group = object #15 (::omero::model::ExperimenterGroup) { _id = object #16 (::omero::RLong) { _val = 2824 } _details = object #17 (::omero::model::Details) { _owner = _group = _creationEvent = _updateEvent = _permissions = object #18 (::omero::model::Permissions) { _restrictions = { [0] = False [1] = True [2] = True [3] = True [4] = True [5] = True } _extendedRestrictions = { } _perm1 = -120 } _externalInfo = _call = { key = omero.client.uuid value = 3cc57eaa-dc1d-4608-a74e-47ea4e5aae8f key = omero.session.uuid value = 294224bd-db09-411a-93f2-3c3793f23815 } _event = object #19 (::omero::sys::EventContext) { shareId = -1 sessionId = 8772 sessionUuid = ********* userId = 3847 userName = ab798f25-d3b0-4156-886e-fdbee11b3dc2 sudoerId = sudoerName = groupId = 2824 groupName = aaea2363-bca0-4bd0-8f0d-8ec943340202 isAdmin = False adminPrivileges = { } eventId = 80630 eventType = User memberOfGroups = { [0] = 2824 [1] = 1 } leaderOfGroups = { } groupPermissions = object #20 (::omero::model::Permissions) { _restrictions = { } _extendedRestrictions = { } _perm1 = -120 } } } _loaded = True _version = _name = object #21 (::omero::RString) { _val = aaea2363-bca0-4bd0-8f0d-8ec943340202 } _ldap = _groupExperimenterMapSeq = { } _groupExperimenterMapLoaded = False _config = { } 
_annotationLinksSeq = { } _annotationLinksLoaded = False _annotationLinksCountPerOwner = { } _description = } _creationEvent = object #22 (::omero::model::Event) { _id = object #23 (::omero::RLong) { _val = 80630 } _details = object #24 (::omero::model::Details) { _owner = _group = _creationEvent = _updateEvent = _permissions = object #25 (::omero::model::Permissions) { _restrictions = { [0] = False [1] = True [2] = True [3] = True [4] = True [5] = True } _extendedRestrictions = { } _perm1 = -120 } _externalInfo = _call = { key = omero.client.uuid value = 3cc57eaa-dc1d-4608-a74e-47ea4e5aae8f key = omero.session.uuid value = 294224bd-db09-411a-93f2-3c3793f23815 } _event = object #26 (::omero::sys::EventContext) { shareId = -1 sessionId = 8772 sessionUuid = ********* userId = 3847 userName = ab798f25-d3b0-4156-886e-fdbee11b3dc2 sudoerId = sudoerName = groupId = 2824 groupName = aaea2363-bca0-4bd0-8f0d-8ec943340202 isAdmin = False adminPrivileges = { } eventId = 80630 eventType = User memberOfGroups = { [0] = 2824 [1] = 1 } leaderOfGroups = { } groupPermissions = object #27 (::omero::model::Permissions) { _restrictions = { } _extendedRestrictions = { } _perm1 = -120 } } } _loaded = True _status = _time = object #28 (::omero::RTime) { _val = 1729749305596 } _experimenter = _experimenterGroup = _type = object #29 (::omero::model::EventType) { _id = object #30 (::omero::RLong) { _val = 4 } _details = object #31 (::omero::model::Details) { _owner = _group = _creationEvent = _updateEvent = _permissions = object #32 (::omero::model::Permissions) { _restrictions = { [0] = False [1] = True [2] = True [3] = False [4] = True [5] = True } _extendedRestrictions = { } _perm1 = -120 } _externalInfo = _call = { key = omero.client.uuid value = 3cc57eaa-dc1d-4608-a74e-47ea4e5aae8f key = omero.session.uuid value = 294224bd-db09-411a-93f2-3c3793f23815 } _event = object #33 (::omero::sys::EventContext) { shareId = -1 sessionId = 8772 sessionUuid = ********* userId = 3847 userName = ab798f25-d3b0-4156-886e-fdbee11b3dc2 sudoerId = sudoerName = groupId = 2824 groupName = aaea2363-bca0-4bd0-8f0d-8ec943340202 isAdmin = False adminPrivileges = { } eventId = 80630 eventType = User memberOfGroups = { [0] = 2824 [1] = 1 } leaderOfGroups = { } groupPermissions = object #34 (::omero::model::Permissions) { _restrictions = { } _extendedRestrictions = { } _perm1 = -120 } } } _loaded = True _value = object #35 (::omero::RString) { _val = User } } _containingEvent = _logsSeq = { } _logsLoaded = True _session = object #36 (::omero::model::Session) { _id = object #37 (::omero::RLong) { _val = 8772 } _details = object #38 (::omero::model::Details) { _owner = _group = _creationEvent = _updateEvent = _permissions = object #39 (::omero::model::Permissions) { _restrictions = { [0] = False [1] = True [2] = True [3] = False [4] = True [5] = True } _extendedRestrictions = { } _perm1 = -120 } _externalInfo = _call = { key = omero.client.uuid value = 3cc57eaa-dc1d-4608-a74e-47ea4e5aae8f key = omero.session.uuid value = 294224bd-db09-411a-93f2-3c3793f23815 } _event = object #40 (::omero::sys::EventContext) { shareId = -1 sessionId = 8772 sessionUuid = ********* userId = 3847 userName = ab798f25-d3b0-4156-886e-fdbee11b3dc2 sudoerId = sudoerName = groupId = 2824 groupName = aaea2363-bca0-4bd0-8f0d-8ec943340202 isAdmin = False adminPrivileges = { } eventId = 80630 eventType = User memberOfGroups = { [0] = 2824 [1] = 1 } leaderOfGroups = { } groupPermissions = object #41 (::omero::model::Permissions) { _restrictions = { } _extendedRestrictions 
= { } _perm1 = -120 } } } _loaded = True _version = _node = object #42 (::omero::model::Node) { _id = object #43 (::omero::RLong) { _val = 1 } _details = _loaded = False _version = _sessionsSeq = { } _sessionsLoaded = False _uuid = _conn = _up = _down = _scale = _annotationLinksSeq = { } _annotationLinksLoaded = False _annotationLinksCountPerOwner = { } } _uuid = object #44 (::omero::RString) { _val = ******** } _owner = _sudoer = _timeToIdle = object #45 (::omero::RLong) { _val = 600000 } _timeToLive = object #46 (::omero::RLong) { _val = 0 } _started = object #47 (::omero::RTime) { _val = 1729749305546 } _closed = _message = object #48 (::omero::RString) { _val = Initial message. } _defaultEventType = object #49 (::omero::RString) { _val = User } _userAgent = object #50 (::omero::RString) { _val = OMERO.py.test } _userIP = _eventsSeq = { } _eventsLoaded = False _annotationLinksSeq = { } _annotationLinksLoaded = True _annotationLinksCountPerOwner = { } } } _updateEvent = _permissions = object #51 (::omero::model::Permissions) { _restrictions = { [0] = False [1] = False [2] = False [3] = False [4] = False [5] = True } _extendedRestrictions = { } _perm1 = -120 } _externalInfo = _call = { key = omero.client.uuid value = 3cc57eaa-dc1d-4608-a74e-47ea4e5aae8f key = omero.session.uuid value = 294224bd-db09-411a-93f2-3c3793f23815 } _event = object #52 (::omero::sys::EventContext) { shareId = -1 sessionId = 8772 sessionUuid = ********* userId = 3847 userName = ab798f25-d3b0-4156-886e-fdbee11b3dc2 sudoerId = sudoerName = groupId = 2824 groupName = aaea2363-bca0-4bd0-8f0d-8ec943340202 isAdmin = False adminPrivileges = { } eventId = 80630 eventType = User memberOfGroups = { [0] = 2824 [1] = 1 } leaderOfGroups = { } groupPermissions = object #53 (::omero::model::Permissions) { _restrictions = { } _extendedRestrictions = { } _perm1 = -120 } } } _loaded = True _version = _pixelsFileMapsSeq = { } _pixelsFileMapsLoaded = True _pixelsFileMapsCountPerOwner = { } _path = object #54 (::omero::RString) { _val = /test } _repo = _size = _atime = object #55 (::omero::RTime) { _val = 1729749305595 } _mtime = object #56 (::omero::RTime) { _val = 1729749305595 } _ctime = object #57 (::omero::RTime) { _val = 1729749305595 } _hasher = _hash = _mimetype = object #58 (::omero::RString) { _val = OMERO.tables } _filesetEntriesSeq = { } _filesetEntriesLoaded = True _annotationLinksSeq = { } _annotationLinksLoaded = True _annotationLinksCountPerOwner = { } _name = object #59 (::omero::RString) { _val = /test } }, which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/tablestest/test_service.py::TestTables::testMask OmeroPy/test/integration/tablestest/test_service.py::TestTables::testMask OmeroPy/test/integration/tablestest/test_service.py::TestTables::testAllColumnsSameTable OmeroPy/test/integration/tablestest/test_service.py::TestTables::testAllColumnsSameTable OmeroPy/test/integration/tablestest/test_service.py::TestTables::testAllColumnsSameTable OmeroPy/test/integration/tablestest/test_service.py::TestTables::testAllColumnsSameTable /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/tablestest/test_service.py:54: DeprecationWarning: The binary mode of fromstring is deprecated, as it behaves surprisingly on unicode inputs. 
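The fromstring() deprecations above (and the "Use frombuffer instead" advice that follows) come from a small test helper that turns raw bytes into a uint8 array. A sketch of the rewritten helper, substituting numpy.uint8 for tables.UInt8Atom() purely for illustration:

    import numpy as np

    def as_uint8(x: bytes) -> np.ndarray:
        # frombuffer() replaces the deprecated binary mode of fromstring();
        # it returns a read-only view, so add .copy() if the array must be writable.
        return np.frombuffer(x, count=len(x), dtype=np.uint8)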
Use frombuffer instead return numpy.fromstring(x, count=len(x), dtype=tables.UInt8Atom()) OmeroPy/test/integration/tablestest/test_tables.py::TestTableIntegrity::testAllColumnsAndMetadata OmeroPy/test/integration/tablestest/test_tables.py::TestTableIntegrity::testAllColumnsAndMetadata OmeroPy/test/integration/tablestest/test_tables.py::TestTableIntegrity::testAllColumnsAndMetadata OmeroPy/test/integration/tablestest/test_tables.py::TestTableIntegrity::testAllColumnsAndMetadata OmeroPy/test/integration/tablestest/test_tables.py::TestTableIntegrity::testAllColumnsAndMetadata OmeroPy/test/integration/tablestest/test_tables.py::TestTableIntegrity::testAllColumnsAndMetadata /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/test/integration/tablestest/test_tables.py:55: DeprecationWarning: The binary mode of fromstring is deprecated, as it behaves surprisingly on unicode inputs. Use frombuffer instead return numpy.fromstring(x, count=len(x), dtype=tables.UInt8Atom()) OmeroPy/test/integration/test_isession.py::TestISession::testJoinSession_Helper /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_isession.py::TestISession::testJoinSession_Helper returned '187195e0-35cc-42f6-a6e7-dfc6a92eea78', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_isession.py::TestISession::testSessionWithIP[127.0.0.1] /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_model_SessionI.py:37: DeprecationWarning: Session.userIP is deprecated warnings.warn(item + " is deprecated", DeprecationWarning) OmeroPy/test/integration/test_ishare.py::TestIShare::test1201 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_ishare.py::TestIShare::test1201 returned (, 8987, 1729749868553), which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_ishare.py::TestIShare::test_OS_regular_user /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_ishare.py::TestIShare::test_OS_regular_user returned (object #0 (::omero::model::Image) { _id = object #1 (::omero::RLong) { _val = 6287 } _details = _loaded = False _version = _series = _acquisitionDate = _archived = _partial = _format = _imagingEnvironment = _objectiveSettings = _instrument = _stageLabel = _experiment = _pixelsSeq = {} _pixelsLoaded = False _wellSamplesSeq = {} _wellSamplesLoaded = False _roisSeq = {} _roisLoaded = False _datasetLinksSeq = {} _datasetLinksLoaded = False _datasetLinksCountPerOwner = { } _folderLinksSeq = {} _folderLinksLoaded = False _folderLinksCountPerOwner = { } _fileset = _annotationLinksSeq = {} _annotationLinksLoaded = False _annotationLinksCountPerOwner = { } _name = _description = }, 9022), which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? 
warnings.warn( OmeroPy/test/integration/test_reporawfilestore.py::TestRepoRawFileStore::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_reporawfilestore.py::TestRepoRawFileStore::test_dir returned '7dc9c696-7d1f-4a1a-80c1-8dedcc112bcc_4195/588ab604-d342-4612-b6d5-2b397bfc0e40', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestRepository::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestRepository::test_dir returned '0f4c5ef3-1fec-4e23-b27c-7377dcd2800c_4196/9096f5f1-b21b-47c8-8d31-a99752c39060', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestFileExists::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestFileExists::test_dir returned 'a31af40b-6b54-415e-a0c5-ad1c2298204f_4197/b8f8b172-d2cd-4c21-885b-1597a65af41d', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestManagedRepositoryMultiUser::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestManagedRepositoryMultiUser::test_dir returned '725a0a55-da38-4a12-8663-08f84928829a_4198/1863c075-375f-4f57-99c4-5a4add33ab33', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestPythonImporter::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestPythonImporter::test_dir returned '4cffb696-b561-487b-afe3-0cb6092a767e_4210/70c1e38d-bfb8-47de-acef-daf5600a757c', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestRawAccess::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestRawAccess::test_dir returned '71d6cf77-849f-42c7-abfb-92e4abfec114_4215/fa7fa5fd-0652-432d-be3a-0807e2efd089', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestDbSync::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestDbSync::test_dir returned 'f812cfc6-adca-43a8-8365-258078c13047_4216/171d3de9-5891-45f2-b0ba-f0b1af0c0e6c', which will be an error in a future version of pytest. 
Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestRecursiveDelete::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestRecursiveDelete::test_dir returned '6e8a5a46-d194-49cf-8716-51ff5d8275b2_4217/1428e588-2006-4fd2-bed8-7d895ad08c23', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestDeleteLog::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestDeleteLog::test_dir returned '9fd49165-6c21-4247-bf27-e30a93c70222_4218/3bfb9304-ab5c-4e9e-b5a0-e5a7f594edc9', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestUserTemplate::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestUserTemplate::test_dir returned '6d5396c2-0774-4929-9e9b-61ae2c412836_4219/1ef29036-8dee-467a-acbf-8e24c1237c05', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestFilesetQueries::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestFilesetQueries::test_dir returned 'c40b7891-7815-4fb5-a17a-2aa129670450_4220/6261f7f6-ad70-4b81-b6b8-339be837fb0a', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestOriginalMetadata::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestOriginalMetadata::test_dir returned 'fa661a73-9fdb-4fc6-8614-c65810942b66_4221/75afa949-0f4d-4fdc-a8ca-da4bd08f653d', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_repository.py::TestDeletePerformance::test_dir /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_repository.py::TestDeletePerformance::test_dir returned 'c0c8864a-e46e-4bbb-af9a-73b60a8046f7_4223/e6028bb7-2a73-4ef9-9065-4e31ad0ce1f3', which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? 
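The PytestReturnNotNoneWarnings above all have the same shape: a test returns a value (a path, an id, a model object) instead of asserting on it, which future pytest versions will treat as an error. A minimal sketch of the usual fix; the fixture name and value are hypothetical and not taken from the OMERO tests:

    import pytest

    @pytest.fixture
    def created_dir():
        # Helpers that produce data for other code belong in fixtures, which may return values
        return "some/managed/repo/dir"

    def test_dir(created_dir):
        # The test itself should assert on the result rather than return it
        assert created_dir.endswith("dir")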
warnings.warn( OmeroPy/test/integration/test_scripts.py::TestScripts::testBasicUsage /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_scripts.py::TestScripts::testBasicUsage returned 3aca1b8a-20a8-4f97-bff8-b7fc921c66e4/7ac0405b-1d0b-49fd-8a28-53b31bf113acomero.api.IScript -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000, which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_scripts.py::TestScripts::testUpload2562 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but OmeroPy/test/integration/test_scripts.py::TestScripts::testUpload2562 returned (b83246fa-be3c-49bd-8e8e-ce01686844f6/b45446cf-c8ff-43f1-8ac4-1b83ac2466b3omero.api.IScript -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000, object #0 (::omero::model::OriginalFile) { _id = object #1 (::omero::RLong) { _val = 5702 } _details = object #2 (::omero::model::Details) { _owner = object #3 (::omero::model::Experimenter) { _id = object #4 (::omero::RLong) { _val = 0 } _details = _loaded = False _version = _groupExperimenterMapSeq = { } _groupExperimenterMapLoaded = False _omeName = _firstName = _middleName = _lastName = _institution = _ldap = _email = _config = { } _annotationLinksSeq = { } _annotationLinksLoaded = False _annotationLinksCountPerOwner = { } } _group = object #5 (::omero::model::ExperimenterGroup) { _id = object #6 (::omero::RLong) { _val = 1 } _details = object #7 (::omero::model::Details) { _owner = _group = _creationEvent = _updateEvent = _permissions = object #8 (::omero::model::Permissions) { _restrictions = { [0] = False [1] = True [2] = True [3] = False [4] = True [5] = True } _extendedRestrictions = { } _perm1 = -52 } _externalInfo = _call = { key = omero.client.uuid value = 7ac0405b-1d0b-49fd-8a28-53b31bf113ac key = omero.session.uuid value = 3aca1b8a-20a8-4f97-bff8-b7fc921c66e4 } _event = object #9 (::omero::sys::EventContext) { shareId = -1 sessionId = 9290 sessionUuid = ********* userId = 4230 userName = abbc9306-92b6-4bd7-b478-f465a28ea60a sudoerId = sudoerName = groupId = 3148 groupName = 2e4786af-cac5-484e-99a6-e53b78a45f67 isAdmin = False adminPrivileges = { } eventId = -1 eventType = User memberOfGroups = { [0] = 3148 [1] = 1 } leaderOfGroups = { } groupPermissions = object #10 (::omero::model::Permissions) { _restrictions = { } _extendedRestrictions = { } _perm1 = -120 } } } _loaded = True _version = object #11 (::omero::RInt) { _val = 0 } _name = object #12 (::omero::RString) { _val = user } _ldap = object #13 (::omero::RBool) { _val = False } _groupExperimenterMapSeq = { } _groupExperimenterMapLoaded = False _config = { } _annotationLinksSeq = { } _annotationLinksLoaded = False _annotationLinksCountPerOwner = { } _description = } _creationEvent = object #14 (::omero::model::Event) { _id = object #15 (::omero::RLong) { _val = 99437 } _details = _loaded = False _status = _time = _experimenter = _experimenterGroup = _type = _containingEvent = _logsSeq = { } _logsLoaded = False _session = } _updateEvent = _permissions = object #16 (::omero::model::Permissions) { _restrictions = { [0] = False [1] = True [2] = True [3] = False [4] = True [5] = True } _extendedRestrictions = { } _perm1 = -120 } _externalInfo = _call = { key = omero.client.uuid value = 7ac0405b-1d0b-49fd-8a28-53b31bf113ac key = 
omero.session.uuid value = 3aca1b8a-20a8-4f97-bff8-b7fc921c66e4 } _event = object #17 (::omero::sys::EventContext) { shareId = -1 sessionId = 9290 sessionUuid = ********* userId = 4230 userName = abbc9306-92b6-4bd7-b478-f465a28ea60a sudoerId = sudoerName = groupId = 3148 groupName = 2e4786af-cac5-484e-99a6-e53b78a45f67 isAdmin = False adminPrivileges = { } eventId = -1 eventType = User memberOfGroups = { [0] = 3148 [1] = 1 } leaderOfGroups = { } groupPermissions = object #18 (::omero::model::Permissions) { _restrictions = { } _extendedRestrictions = { } _perm1 = -120 } } } _loaded = True _version = _pixelsFileMapsSeq = { } _pixelsFileMapsLoaded = False _pixelsFileMapsCountPerOwner = { } _path = object #19 (::omero::RString) { _val = / } _repo = object #20 (::omero::RString) { _val = ScriptRepo } _size = object #21 (::omero::RLong) { _val = 168 } _atime = _mtime = _ctime = _hasher = object #22 (::omero::model::ChecksumAlgorithm) { _id = object #23 (::omero::RLong) { _val = 6 } _details = _loaded = False _value = } _hash = object #24 (::omero::RString) { _val = 0232cafa500bc438e2aecf7a78348d70652575a6 } _mimetype = object #25 (::omero::RString) { _val = text/x-python } _filesetEntriesSeq = { } _filesetEntriesLoaded = True _annotationLinksSeq = { } _annotationLinksLoaded = False _annotationLinksCountPerOwner = { } _name = object #26 (::omero::RString) { _val = 124aa599-7496-47b5-99f2-df84ca8870a6.py } }), which will be an error in a future version of pytest. Did you mean to use `assert` instead of `return`? warnings.warn( OmeroPy/test/integration/test_scripts.py::TestScripts::testRunScript OmeroPy/test/integration/test_scripts.py::TestScripts::testRunScript /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/util/decorators.py:82: DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead log.warn( OmeroPy/test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testThumbs OmeroPy/test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testThumbs OmeroPy/test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testThumbs OmeroPy/test/integration/test_tickets1000.py::TestTicket1000::test880 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/util/script_utils.py:1313: DeprecationWarning: This method is deprecated as of OMERO 5.3.0 warnings.warn( OmeroPy/test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testThumbs OmeroPy/test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testThumbs OmeroPy/test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testThumbs OmeroPy/test/integration/test_tickets1000.py::TestTicket1000::test880 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/util/script_utils.py:1074: DeprecationWarning: This method is deprecated as of OMERO 5.3.0. Use upload_plane instead warnings.warn( OmeroPy/test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testThumbs OmeroPy/test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testThumbs OmeroPy/test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testThumbs OmeroPy/test/integration/test_tickets1000.py::TestTicket1000::test880 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/util/script_utils.py:1259: DeprecationWarning: This method is deprecated as of OMERO 5.3.0. 
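The logging deprecation repeated throughout this run ("The 'warn' method is deprecated, use 'warning' instead") is a one-word rename on the standard-library logger. A trivial sketch, reusing one of the messages seen earlier in this log:

    import logging

    log = logging.getLogger("omero.util.populate_metadata")
    # Logger.warn() is a deprecated alias; warning() is the supported method
    log.warning("PlateColumn is unimplemented")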
Use reset_rendering_settings warnings.warn(
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/target/reports/integration/junit-results.xml -
=========================== short test summary info ============================
FAILED test/integration/clitest/test_admin.py::TestAdmin::test_checkupgrade0 - omero.cli.NonZeroReturnCode: assert failed
FAILED test/integration/clitest/test_admin.py::TestAdmin::test_checkupgrade1 - AssertionError: assert 2 == 1 + where 2 = NonZeroReturnCode('assert failed').rv + where NonZeroReturnCode('assert failed') = .value
FAILED test/integration/clitest/test_admin.py::TestAdminRestrictedAdmin::test_checkupgrade0 - omero.cli.NonZeroReturnCode: assert failed
FAILED test/integration/clitest/test_admin.py::TestAdminRestrictedAdmin::test_checkupgrade1 - AssertionError: assert 2 == 1 + where 2 = NonZeroReturnCode('assert failed').rv + where NonZeroReturnCode('assert failed') = .value
FAILED test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetObjectsByMapAnnotations[Image] - Ice.ConnectionLostException: Ice.ConnectionLostException: recv() returned zero
FAILED test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetObjectsByMapAnnotations[Dataset] - Ice.ConnectionLostException: Ice.ConnectionLostException: recv() returned zero
FAILED test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetObjectsByMapAnnotations[Project] - Ice.ConnectionLostException: Ice.ConnectionLostException: recv() returned zero
FAILED test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetObjectsByMapAnnotations[Screen] - Ice.ConnectionLostException: Ice.ConnectionLostException: recv() returned zero
FAILED test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetObjectsByMapAnnotations[Plate] - Ice.ConnectionLostException: Ice.ConnectionLostException: recv() returned zero
FAILED test/integration/gatewaytest/test_user.py::TestUser::testSaveAs - urllib.error.URLError:
FAILED test/integration/test_util.py::TestUpgradeCheck::testReal - assert False is True + where False = isUpgradeNeeded() + where isUpgradeNeeded = .isUpgradeNeeded
ERROR test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testClose - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_connection.py::TestConnectionMethods::testTopLevelObjects - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_get_objects.py::TestFindObject::testFindExperimenter - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetAnnotations - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetImage - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetImageLoadPixels[True-True] - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetImageLoadPixels[True-False] - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetImageLoadPixels[False-True] - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_get_objects.py::TestGetObject::testGetImageLoadPixels[False-False] - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_get_objects.py::TestGetObject::testTraversal - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testThumbnail - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testThumbnailSet - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testRenderingModels - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testSplitChannel - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testLinePlots - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testProjections - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testProperties - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testPixelSizeUnits - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testUnitsGetValue - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testChannelWavelengthUnits - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testExposureTimeUnits - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testShortname - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testImageDate - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testSimpleMarshal - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testExport - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testRenderJpegRegion - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testRenderJpegRegion_resolution - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testRenderJpegRegion_invalid_resolution - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testRenderBirdsEyeView - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_image.py::TestImage::testRenderBirdsEyeView_Size - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_pixels.py::TestPixels::testReuseRawPixelsStore - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_pixels.py::TestPixels::testPlaneInfo - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_pixels.py::TestPixels::testPixelsType - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_pixels.py::TestPixels::testGetTile - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_pixels.py::TestPixels::testGetPlane - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_pixels.py::TestPixels::testGetPlanesExceptionOnGetPlane - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_pixels.py::TestPixels::testGetPlanesExceptionOnClose - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_pixels.py::TestPixels::testGetPlanesExceptionOnBoth - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_pixels.py::TestPixels::testGetHistogram - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testDefault - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testCustomized - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testChannelWindows - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testFloatDefaultMinMax - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testEmissionWave - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testBatchCopy - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testGroupBasedPermissions - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testGetRdefs - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testResetDefaults - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testQuantizationSettings - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testQuantizationSettingsInvalid - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testQuantizationSettingsBulk - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testGetChannelsNoRE - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testSetActiveChannelsNoRE - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testSetActiveChannelsWithRE - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::test_set_active_channels_set_inactive[True] - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::test_set_active_channels_set_inactive[False] - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testUnregisterService - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testRegisterService - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_rdefs.py::TestRDefs::testCloseRE - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_wrapper.py::TestWrapper::testAllObjectsWrapped - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_wrapper.py::TestWrapper::testDatasetWrapper - urllib.error.URLError:
ERROR test/integration/gatewaytest/test_wrapper.py::TestWrapper::testDetailsWrapper - urllib.error.URLError:
= 11 failed, 1943 passed, 9 skipped, 36 deselected, 1924 warnings, 62 errors in 15197.33s (4:13:17) =
!! 10/24/24 07:27:53.707 error: 257 communicators not destroyed during global destruction.
Result: 1
BUILD SUCCESSFUL
Total time: 253 minutes 21 seconds
+ /home/omero/workspace/OMERO-test-integration/src/build.py -f components/tools/OmeroFS/build.xml integration -Dtestreports.dir=target/reports/integration
OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
Buildfile: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroFS/build.xml
Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroFS...
python-integration:
Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroFS/target/reports/integration
============================= test session starts ==============================
platform linux -- Python 3.9.18, pytest-8.3.3, pluggy-1.5.0 -- /home/omero/workspace/OMERO-test-integration/.venv3/bin/python3
cachedir: .pytest_cache
django: version: 4.2.16, settings: omeroweb.settings (from ini)
rootdir: /home/omero/workspace/OMERO-test-integration/src/components/tools
configfile: pytest.ini
plugins: xdist-3.6.1, mock-3.14.0, django-4.9.0
collecting ...
collected 1 item
test/integration/test_dbclient.py::TestDropBoxClient::test1 PASSED [100%]
=============================== warnings summary ===============================
../../../../.venv3/lib64/python3.9/site-packages/Ice.py:14
/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/Ice.py:14: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
import sys, string, imp, os, threading, warnings, datetime
../../../../.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:241
/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:241: RemovedInDjango50Warning: The default value of USE_TZ will change from False to True in Django 5.0. Set USE_TZ to False in your project settings if you want to keep the current default behavior.
warnings.warn(
../../../../.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:289
/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:289: RemovedInDjango51Warning: The STATICFILES_STORAGE setting is deprecated. Use STORAGES instead.
warnings.warn(STATICFILES_STORAGE_DEPRECATED_MSG, RemovedInDjango51Warning)
../../../../.venv3/lib64/python3.9/site-packages/pipeline/__init__.py:1
/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/pipeline/__init__.py:1: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
from pkg_resources import DistributionNotFound, get_distribution
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroFS/target/reports/integration/junit-results.xml -
======================== 1 passed, 4 warnings in 0.21s =========================
BUILD SUCCESSFUL
Total time: 7 seconds
+ /home/omero/workspace/OMERO-test-integration/src/build.py -f components/tools/OmeroWeb/build.xml integration -Dtestreports.dir=target/reports/integration
OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
Buildfile: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroWeb/build.xml
Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroWeb...
Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroWeb...
python-integration:
Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroWeb/target/reports/integration
============================= test session starts ==============================
platform linux -- Python 3.9.18, pytest-8.3.3, pluggy-1.5.0 -- /home/omero/workspace/OMERO-test-integration/.venv3/bin/python3
cachedir: .pytest_cache
django: version: 4.2.16, settings: omeroweb.settings (from ini)
rootdir: /home/omero/workspace/OMERO-test-integration/src/components/tools
configfile: pytest.ini
plugins: xdist-3.6.1, mock-3.14.0, django-4.9.0
collecting ...
collected 430 items test/integration/test_annotate.py::TestMapAnnotations::test_annotate_map PASSED [ 0%] test/integration/test_annotate.py::TestTagging::test_create_tag PASSED [ 0%] test/integration/test_annotate.py::TestTagging::test_annotate_tag PASSED [ 0%] test/integration/test_annotate.py::TestBatchAnnotate::test_batch_annotate_tag PASSED [ 0%] test/integration/test_annotate.py::TestFileAnnotations::test_add_fileannotations_form PASSED [ 1%] test/integration/test_annotate.py::TestFileAnnotations::test_batch_add_fileannotations[1] PASSED [ 1%] test/integration/test_annotate.py::TestFileAnnotations::test_batch_add_fileannotations[2] PASSED [ 1%] test/integration/test_api_containers.py::TestContainers::test_create_update_unsupported[method0-Plate] PASSED [ 1%] test/integration/test_api_containers.py::TestContainers::test_create_update_unsupported[method0-Image] PASSED [ 2%] test/integration/test_api_containers.py::TestContainers::test_create_update_unsupported[method0-Well] PASSED [ 2%] test/integration/test_api_containers.py::TestContainers::test_create_update_unsupported[method0-Channel] PASSED [ 2%] test/integration/test_api_containers.py::TestContainers::test_create_update_unsupported[method0-foo] PASSED [ 2%] test/integration/test_api_containers.py::TestContainers::test_create_update_unsupported[method1-Plate] PASSED [ 3%] test/integration/test_api_containers.py::TestContainers::test_create_update_unsupported[method1-Image] PASSED [ 3%] test/integration/test_api_containers.py::TestContainers::test_create_update_unsupported[method1-Well] PASSED [ 3%] test/integration/test_api_containers.py::TestContainers::test_create_update_unsupported[method1-Channel] PASSED [ 3%] test/integration/test_api_containers.py::TestContainers::test_create_update_unsupported[method1-foo] PASSED [ 3%] test/integration/test_api_containers.py::TestContainers::test_delete_unsupported[Plate] PASSED [ 4%] test/integration/test_api_containers.py::TestContainers::test_delete_unsupported[Image] PASSED [ 4%] test/integration/test_api_containers.py::TestContainers::test_delete_unsupported[Well] PASSED [ 4%] test/integration/test_api_containers.py::TestContainers::test_container_crud[Project] PASSED [ 4%] test/integration/test_api_containers.py::TestContainers::test_container_crud[Dataset] PASSED [ 5%] test/integration/test_api_containers.py::TestContainers::test_container_crud[Screen] PASSED [ 5%] test/integration/test_api_containers.py::TestContainers::test_datasets_plates[Dataset-True] PASSED [ 5%] test/integration/test_api_containers.py::TestContainers::test_datasets_plates[Dataset-False] PASSED [ 5%] test/integration/test_api_containers.py::TestContainers::test_datasets_plates[Plate-True] PASSED [ 6%] test/integration/test_api_containers.py::TestContainers::test_datasets_plates[Plate-False] PASSED [ 6%] test/integration/test_api_containers.py::TestContainers::test_screens PASSED [ 6%] test/integration/test_api_containers.py::TestContainers::test_screen_plates_update PASSED [ 6%] test/integration/test_api_containers.py::TestContainers::test_container_tags_update[dtype0] PASSED [ 6%] test/integration/test_api_containers.py::TestContainers::test_container_tags_update[dtype1] PASSED [ 7%] test/integration/test_api_containers.py::TestContainers::test_container_tags_update[dtype2] PASSED [ 7%] test/integration/test_api_containers.py::TestContainers::test_spw_urls PASSED [ 7%] test/integration/test_api_containers.py::TestContainers::test_spw_parent_urls PASSED [ 7%] 
test/integration/test_api_containers.py::TestContainers::test_pdi_urls PASSED [ 8%] test/integration/test_api_containers.py::TestContainers::test_pdi_parent_urls PASSED [ 8%] test/integration/test_api_errors.py::TestErrors::test_save_post_no_id PASSED [ 8%] test/integration/test_api_errors.py::TestErrors::test_save_put_id PASSED [ 8%] test/integration/test_api_errors.py::TestErrors::test_marshal_type PASSED [ 9%] test/integration/test_api_errors.py::TestErrors::test_invalid_parameter PASSED [ 9%] test/integration/test_api_errors.py::TestErrors::test_marshal_validation PASSED [ 9%] test/integration/test_api_errors.py::TestErrors::test_security_violation PASSED [ 9%] test/integration/test_api_errors.py::TestErrors::test_validation_exception PASSED [ 10%] test/integration/test_api_errors.py::TestErrors::test_project_validation PASSED [ 10%] test/integration/test_api_experimenters_groups.py::TestExperimenters::test_create_update_unsupported[method0-Experimenter] PASSED [ 10%] test/integration/test_api_experimenters_groups.py::TestExperimenters::test_create_update_unsupported[method0-ExperimenterGroup] PASSED [ 10%] test/integration/test_api_experimenters_groups.py::TestExperimenters::test_create_update_unsupported[method1-Experimenter] PASSED [ 10%] test/integration/test_api_experimenters_groups.py::TestExperimenters::test_create_update_unsupported[method1-ExperimenterGroup] PASSED [ 11%] test/integration/test_api_experimenters_groups.py::TestExperimenters::test_delete_unsupported[Experimenter] PASSED [ 11%] test/integration/test_api_experimenters_groups.py::TestExperimenters::test_delete_unsupported[ExperimenterGroup] PASSED [ 11%] test/integration/test_api_experimenters_groups.py::TestExperimenters::test_experimenters_groups PASSED [ 11%] test/integration/test_api_experimenters_groups.py::TestExperimenters::test_groups_experimenters PASSED [ 12%] test/integration/test_api_experimenters_groups.py::TestExperimenters::test_filter_groups PASSED [ 12%] test/integration/test_api_experimenters_groups.py::TestExperimenters::test_filter_experimenters PASSED [ 12%] test/integration/test_api_images.py::TestImages::test_dataset_images PASSED [ 12%] test/integration/test_api_login.py::TestLogin::test_versions PASSED [ 13%] test/integration/test_api_login.py::TestLogin::test_base_url PASSED [ 13%] test/integration/test_api_login.py::TestLogin::test_base_url_versions_404 PASSED [ 13%] test/integration/test_api_login.py::TestLogin::test_login_get PASSED [ 13%] test/integration/test_api_login.py::TestLogin::test_login_csrf PASSED [ 13%] test/integration/test_api_login.py::TestLogin::test_login_errors[credentials0] PASSED [ 14%] test/integration/test_api_login.py::TestLogin::test_login_errors[credentials1] PASSED [ 14%] test/integration/test_api_login.py::TestLogin::test_login_errors[credentials2] PASSED [ 14%] test/integration/test_api_login.py::TestLogin::test_login_errors[credentials3] PASSED [ 14%] test/integration/test_api_login.py::TestLogin::test_login_example PASSED [ 15%] test/integration/test_api_projects.py::TestProjects::test_marshal_projects_not_logged_in PASSED [ 15%] test/integration/test_api_projects.py::TestProjects::test_marshal_projects_no_results PASSED [ 15%] test/integration/test_api_projects.py::TestProjects::test_marshal_projects_user PASSED [ 15%] test/integration/test_api_projects.py::TestProjects::test_marshal_projects_another_user PASSED [ 16%] test/integration/test_api_projects.py::TestProjects::test_marshal_projects_another_group PASSED [ 16%] 
test/integration/test_api_projects.py::TestProjects::test_marshal_projects_all_groups PASSED [ 16%] test/integration/test_api_projects.py::TestProjects::test_marshal_projects_all_users PASSED [ 16%] test/integration/test_api_projects.py::TestProjects::test_marshal_projects_pagination PASSED [ 16%] test/integration/test_api_projects.py::TestProjects::test_marshal_projects_params PASSED [ 17%] test/integration/test_api_projects.py::TestProjects::test_project_create_read PASSED [ 17%] test/integration/test_api_projects.py::TestProjects::test_project_create_other_group PASSED [ 17%] test/integration/test_api_projects.py::TestProjects::test_project_update PASSED [ 17%] test/integration/test_api_projects.py::TestProjects::test_project_datasets_update[Project] PASSED [ 18%] test/integration/test_api_projects.py::TestProjects::test_project_datasets_update[Dataset] PASSED [ 18%] test/integration/test_api_projects.py::TestProjects::test_project_delete PASSED [ 18%] test/integration/test_api_rois.py::TestContainers::test_image_rois PASSED [ 18%] test/integration/test_api_rois.py::TestContainers::test_roi_delete PASSED [ 19%] test/integration/test_api_rois.py::TestContainers::test_shapes PASSED [ 19%] test/integration/test_api_wells.py::TestWells::test_plate_wells PASSED [ 19%] test/integration/test_api_wells.py::TestWells::test_plate_index_wells PASSED [ 19%] test/integration/test_api_wells.py::TestWells::test_well PASSED [ 20%] test/integration/test_chgrp.py::TestChgrp::test_load_chgrp_groups[user] PASSED [ 20%] test/integration/test_chgrp.py::TestChgrp::test_load_chgrp_groups[admin] PASSED [ 20%] test/integration/test_chgrp.py::TestChgrp::test_chgrp_dry_run[user] PASSED [ 20%] test/integration/test_chgrp.py::TestChgrp::test_chgrp_new_container[user] PASSED [ 20%] test/integration/test_chgrp.py::TestChgrp::test_chgrp_new_container[admin] PASSED [ 21%] test/integration/test_chgrp.py::TestChgrp::test_chgrp_old_container[user] PASSED [ 21%] test/integration/test_chgrp.py::TestChgrp::test_chgrp_old_container[admin] PASSED [ 21%] test/integration/test_chown.py::TestChown::test_chown_dry_run[admin] PASSED [ 21%] test/integration/test_config.py::test_flattenProperties PASSED [ 22%] test/integration/test_config.py::TestConfig::testDefaultConfig PASSED [ 22%] test/integration/test_config.py::TestConfig::testDefaultConfigConversion PASSED [ 22%] test/integration/test_config.py::TestConfig::testUpgradeDropdownMenuConfig[foo-colleagues.label] PASSED [ 22%] test/integration/test_config.py::TestConfig::testUpgradeDropdownMenuConfig[foo-leaders.label] PASSED [ 23%] test/integration/test_config.py::TestConfig::testUpgradeDropdownMenuConfig[foo-everyone.label] PASSED [ 23%] test/integration/test_containers.py::TestContainers::test_add_and_rename_container PASSED [ 23%] test/integration/test_containers.py::TestContainers::test_add_owned_container PASSED [ 23%] test/integration/test_containers.py::TestContainers::test_paste_move_remove_deletamany_image PASSED [ 23%] test/integration/test_containers.py::TestContainers::test_edit_share PASSED [ 24%] test/integration/test_csrf.py::TestCsrf::test_csrf_middleware_enabled PASSED [ 24%] test/integration/test_csrf.py::TestCsrf::test_forgot_password PASSED [ 24%] test/integration/test_csrf.py::TestCsrf::test_move_data PASSED [ 24%] test/integration/test_csrf.py::TestCsrf::test_add_and_remove_comment PASSED [ 25%] test/integration/test_csrf.py::TestCsrf::test_attach_file PASSED [ 25%] test/integration/test_csrf.py::TestCsrf::test_edit_channel_names PASSED [ 25%] 
test/integration/test_csrf.py::TestCsrf::test_copy_past_rendering_settings PASSED [ 25%] test/integration/test_csrf.py::TestCsrf::test_reset_rendering_settings PASSED [ 26%] test/integration/test_csrf.py::TestCsrf::test_apply_owners_rendering_settings PASSED [ 26%] test/integration/test_csrf.py::TestCsrf::test_ome_tiff_script PASSED [ 26%] test/integration/test_csrf.py::TestCsrf::test_script PASSED [ 26%] test/integration/test_csrf.py::TestCsrf::test_myaccount PASSED [ 26%] test/integration/test_csrf.py::TestCsrf::test_avatar PASSED [ 27%] test/integration/test_csrf.py::TestCsrf::test_create_group PASSED [ 27%] test/integration/test_csrf.py::TestCsrf::test_create_user PASSED [ 27%] test/integration/test_csrf.py::TestCsrf::test_edit_group PASSED [ 27%] test/integration/test_csrf.py::TestCsrf::test_edit_user PASSED [ 28%] test/integration/test_csrf.py::TestCsrf::test_edit_group_by_owner PASSED [ 28%] test/integration/test_csrf.py::TestCsrf::test_change_password PASSED [ 28%] test/integration/test_csrf.py::TestCsrf::test_su PASSED [ 28%] test/integration/test_decorators.py::TestShow::test_conn_cleanup PASSED [ 29%] test/integration/test_download.py::TestDownload::test_spw_download PASSED [ 29%] test/integration/test_download.py::TestDownload::test_orphaned_image_direct_download PASSED [ 29%] test/integration/test_download.py::TestDownload::test_orphaned_image_download PASSED [ 29%] test/integration/test_download.py::TestDownload::test_image_in_dataset_download PASSED [ 30%] test/integration/test_download.py::TestDownload::test_image_in_dataset_in_project_download PASSED [ 30%] test/integration/test_download.py::TestDownload::test_well_download PASSED [ 30%] test/integration/test_download.py::TestDownload::test_attachment_download PASSED [ 30%] test/integration/test_download.py::TestDownloadAs::test_download_image_as[jpeg] PASSED [ 30%] test/integration/test_download.py::TestDownloadAs::test_download_image_as[png] PASSED [ 31%] test/integration/test_download.py::TestDownloadAs::test_download_image_as[tif] PASSED [ 31%] test/integration/test_download.py::TestDownloadAs::test_download_images_as_zip[jpeg] PASSED [ 31%] test/integration/test_download.py::TestDownloadAs::test_download_images_as_zip[png] PASSED [ 31%] test/integration/test_download.py::TestDownloadAs::test_download_images_as_zip[tif] PASSED [ 32%] test/integration/test_groups_users.py::TestGroupsUsers::test_group_users_menu PASSED [ 32%] test/integration/test_histogram.py::TestHistogram::test_histogram_bin_count[None] PASSED [ 32%] test/integration/test_histogram.py::TestHistogram::test_histogram_bin_count[10] PASSED [ 32%] test/integration/test_history.py::TestHistory::test_history PASSED [ 33%] test/integration/test_history.py::TestHistory::test_calendar_default PASSED [ 33%] test/integration/test_history.py::TestHistory::test_calendar_month PASSED [ 33%] test/integration/test_links.py::TestLinks::test_link_project_datasets PASSED [ 33%] test/integration/test_links.py::TestLinks::test_link_datasets_images PASSED [ 33%] test/integration/test_links.py::TestLinks::test_link_unlink_tagset_tags PASSED [ 34%] test/integration/test_links.py::TestLinks::test_unlink_screen_plate PASSED [ 34%] test/integration/test_login.py::TestLogin::test_login_errors[credentials0] PASSED [ 34%] test/integration/test_login.py::TestLogin::test_login_errors[credentials1] PASSED [ 34%] test/integration/test_login.py::TestLogin::test_login_errors[credentials2] PASSED [ 35%] test/integration/test_login.py::TestLogin::test_login_errors[credentials3] PASSED [ 
35%] test/integration/test_login.py::TestLogin::test_get_login_page PASSED [ 35%] test/integration/test_login.py::TestLogin::test_login_redirect[] PASSED [ 35%] test/integration/test_login.py::TestLogin::test_login_redirect[/webclient/usertags/] PASSED [ 36%] test/integration/test_login.py::TestLogin::test_login_view PASSED [ 36%] test/integration/test_marshal.py::TestImgDetail::test_image_detail PASSED [ 36%] test/integration/test_metadata.py::TestCoreMetadata::test_pixel_size_units PASSED [ 36%] test/integration/test_metadata.py::TestCoreMetadata::test_none_pixel_size PASSED [ 36%] test/integration/test_metadata.py::TestBulkAnnotations::test_nsbulkannotations_file[True] PASSED [ 37%] test/integration/test_metadata.py::TestBulkAnnotations::test_nsbulkannotations_file[False] PASSED [ 37%] test/integration/test_metadata.py::TestBulkAnnotations::test_nsbulkannotations_not_file PASSED [ 37%] test/integration/test_plategrid.py::TestPlateGrid::test_get_plate_grid_metadata[shrink] PASSED [ 37%] test/integration/test_plategrid.py::TestPlateGrid::test_get_plate_grid_metadata[trim] PASSED [ 38%] test/integration/test_plategrid.py::TestPlateGrid::test_get_plate_grid_metadata[expand] PASSED [ 38%] test/integration/test_plategrid.py::TestPlateGrid::test_well_images PASSED [ 38%] test/integration/test_plategrid.py::TestPlateGrid::test_instantiation PASSED [ 38%] test/integration/test_plategrid.py::TestPlateGrid::test_metadata_grid_size PASSED [ 39%] test/integration/test_plategrid.py::TestPlateGrid::test_metadata_thumbnail_url PASSED [ 39%] test/integration/test_plategrid.py::TestPlateGrid::test_full_grid PASSED [ 39%] test/integration/test_plategrid.py::TestPlateGrid::test_acquisition_date PASSED [ 39%] test/integration/test_plategrid.py::TestPlateGrid::test_creation_date PASSED [ 40%] test/integration/test_plategrid.py::TestPlateGrid::test_description PASSED [ 40%] test/integration/test_plategrid.py::TestScreenPlateTables::test_get_plate_table PASSED [ 40%] test/integration/test_rendering.py::TestRendering::test_copy_past_rendering_settings_from_image PASSED [ 40%] test/integration/test_rendering.py::TestRendering::test_copy_past_rendering_settings_from_url PASSED [ 40%] test/integration/test_rendering.py::TestRendering::test_all_rendering_defs PASSED [ 41%] test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_incomplete_request PASSED [ 41%] test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_malformed_tile_argument PASSED [ 41%] test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_malformed_region_argument PASSED [ 41%] test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_tile_params PASSED [ 42%] test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_tile_params_large_image PASSED [ 42%] test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_tile_params_negative_resolution PASSED [ 42%] test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_tile_params_invalid_resolution PASSED [ 42%] test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_tile_params_big_image PASSED [ 43%] test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_region_params PASSED [ 43%] test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_region_params_big_image PASSED [ 43%] 
test/integration/test_rendering.py::TestRenderImageRegion::test_render_birds_eye_view_big_image PASSED [ 43%] test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_big_image_resolution PASSED [ 43%] test/integration/test_scripts.py::TestScripts::test_script_ui_defaults PASSED [ 44%] test/integration/test_scripts.py::TestScripts::test_script_inputs_outputs[inputs0] PASSED [ 44%] test/integration/test_scripts.py::TestScripts::test_script_inputs_outputs[inputs1] PASSED [ 44%] test/integration/test_scripts.py::TestScripts::test_script_inputs_outputs[inputs2] PASSED [ 44%] test/integration/test_scripts.py::TestFigureScripts::test_figure_script_dialog[SplitView] PASSED [ 45%] test/integration/test_scripts.py::TestFigureScripts::test_figure_script_dialog[Thumbnail] PASSED [ 45%] test/integration/test_scripts.py::TestFigureScripts::test_figure_script_dialog[MakeMovie] PASSED [ 45%] test/integration/test_show.py::TestShow::test_empty_path PASSED [ 45%] test/integration/test_show.py::TestShow::test_project_legacy_path PASSED [ 46%] test/integration/test_show.py::TestShow::test_projects_legacy_show PASSED [ 46%] test/integration/test_show.py::TestShow::test_project_dataset_legacy_path PASSED [ 46%] test/integration/test_show.py::TestShow::test_project_dataset_image_legacy_path PASSED [ 46%] test/integration/test_show.py::TestShow::test_tag_redirect[1] PASSED [ 46%] test/integration/test_show.py::TestShow::test_tag_redirect[2] PASSED [ 47%] test/integration/test_show.py::TestShow::test_tagset_redirect[1] PASSED [ 47%] test/integration/test_show.py::TestShow::test_tagset_redirect[2] PASSED [ 47%] test/integration/test_show.py::TestShow::test_tag_legacy_path[1] PASSED [ 47%] test/integration/test_show.py::TestShow::test_tag_legacy_path[2] PASSED [ 48%] test/integration/test_show.py::TestShow::test_tagset_tag_legacy_path PASSED [ 48%] test/integration/test_show.py::TestShow::test_image_legacy_path PASSED [ 48%] test/integration/test_show.py::TestShow::test_screen_legacy_path PASSED [ 48%] test/integration/test_show.py::TestShow::test_screen_plate_legacy_path PASSED [ 49%] test/integration/test_show.py::TestShow::test_screen_plate_well_show PASSED [ 49%] test/integration/test_show.py::TestShow::test_screen_plate_run_well_show[0] PASSED [ 49%] test/integration/test_show.py::TestShow::test_screen_plate_run_well_show[1] PASSED [ 49%] test/integration/test_show.py::TestShow::test_project_dataset_image_show PASSED [ 50%] test/integration/test_show.py::TestShow::test_project_dataset_image_roi_show PASSED [ 50%] test/integration/test_show.py::TestShow::test_project_by_id PASSED [ 50%] test/integration/test_show.py::TestShow::test_project_by_name PASSED [ 50%] test/integration/test_show.py::TestShow::test_tag_by_value[1] PASSED [ 50%] test/integration/test_show.py::TestShow::test_tag_by_value[2] PASSED [ 51%] test/integration/test_show.py::TestShow::test_tagset_tag_by_id PASSED [ 51%] test/integration/test_show.py::TestShow::test_multiple_well_by_id[plate.name=%(plate_name)s|well.name=%(well_name)s-A10] PASSED [ 51%] test/integration/test_show.py::TestShow::test_multiple_well_by_id[plate.name=%(plate_name)s|well.name=%(well_name)s-1J] PASSED [ 51%] test/integration/test_show.py::TestShow::test_screen_plate_run_well_by_name[plate.name=%(plate_name)s|well.name=%(well_name)s-A10] PASSED [ 52%] test/integration/test_show.py::TestShow::test_screen_plate_run_well_by_name[plate.name=%(plate_name)s|well.name=%(well_name)s-1J] PASSED [ 52%] 
test/integration/test_show.py::TestShow::test_multiple_well_by_id[run=%(plate_acquisition_id)s|well.name=%(well_name)s-A10] PASSED [ 52%] test/integration/test_show.py::TestShow::test_multiple_well_by_id[run=%(plate_acquisition_id)s|well.name=%(well_name)s-1J] PASSED [ 52%] test/integration/test_show.py::TestShow::test_screen_plate_run_well_by_name[run=%(plate_acquisition_id)s|well.name=%(well_name)s-A10] PASSED [ 53%] test/integration/test_show.py::TestShow::test_screen_plate_run_well_by_name[run=%(plate_acquisition_id)s|well.name=%(well_name)s-1J] PASSED [ 53%] test/integration/test_show.py::TestShow::test_well_by_name[A10] PASSED [ 53%] test/integration/test_show.py::TestShow::test_well_by_name[1J] PASSED [ 53%] test/integration/test_show.py::TestShow::test_screen_plate_run_illegal_run PASSED [ 53%] test/integration/test_show.py::TestShow::test_path_to_no_objects PASSED [ 54%] test/integration/test_show.py::TestShow::test_project_dataset_image PASSED [ 54%] test/integration/test_show.py::TestShow::test_project_dataset_images_pagination PASSED [ 54%] test/integration/test_show.py::TestShow::test_orphaned_pagination PASSED [ 54%] test/integration/test_show.py::TestShow::test_get_image_ids_dataset PASSED [ 55%] test/integration/test_show.py::TestShow::test_image PASSED [ 55%] test/integration/test_show.py::TestShow::test_roi PASSED [ 55%] test/integration/test_show.py::TestShow::test_shape PASSED [ 55%] test/integration/test_show.py::TestShow::test_well_image_shape PASSED [ 56%] test/integration/test_show.py::TestShow::test_image_orphan PASSED [ 56%] test/integration/test_show.py::TestShow::test_image_multi_link PASSED [ 56%] test/integration/test_show.py::TestShow::test_image_multi_link_restrict_dataset PASSED [ 56%] test/integration/test_show.py::TestShow::test_image_multi_link_restrict_dataset_project PASSED [ 56%] test/integration/test_show.py::TestShow::test_image_multi_link_restrict_project PASSED [ 57%] test/integration/test_show.py::TestShow::test_dataset PASSED [ 57%] test/integration/test_show.py::TestShow::test_dataset_multi_link PASSED [ 57%] test/integration/test_show.py::TestShow::test_dataset_multi_link_restrict_project PASSED [ 57%] test/integration/test_show.py::TestShow::test_dataset_orphan PASSED [ 58%] test/integration/test_show.py::TestShow::test_project PASSED [ 58%] test/integration/test_show.py::TestShow::test_acquisition PASSED [ 58%] test/integration/test_show.py::TestShow::test_acquisition_restrict_plate PASSED [ 58%] test/integration/test_show.py::TestShow::test_acquisition_restrict_screen PASSED [ 59%] test/integration/test_show.py::TestShow::test_acquisition_restrict_plate_screen PASSED [ 59%] test/integration/test_show.py::TestShow::test_well PASSED [ 59%] test/integration/test_show.py::TestShow::test_well_multi PASSED [ 59%] test/integration/test_show.py::TestShow::test_well_image PASSED [ 60%] test/integration/test_show.py::TestShow::test_well_restrict_acquisition_multi PASSED [ 60%] test/integration/test_show.py::TestShow::test_well_restrict_plate_multi PASSED [ 60%] test/integration/test_show.py::TestShow::test_well_restrict_screen_multi PASSED [ 60%] test/integration/test_show.py::TestShow::test_well_restrict_acquisition_plate_multi PASSED [ 60%] test/integration/test_show.py::TestShow::test_well_restrict_acquisition_screen_multi PASSED [ 61%] test/integration/test_show.py::TestShow::test_well_restrict_acquisition_plate_screen_multi PASSED [ 61%] test/integration/test_show.py::TestShow::test_well_restrict_plate_screen_multi PASSED [ 61%] 
test/integration/test_show.py::TestShow::test_plate PASSED [ 61%] test/integration/test_show.py::TestShow::test_plate_restrict_screen PASSED [ 62%] test/integration/test_show.py::TestShow::test_screen PASSED [ 62%] test/integration/test_simple.py::TestSimple::testCurrentUser PASSED [ 62%] test/integration/test_simple.py::TestSimple::testImport PASSED [ 62%] test/integration/test_table.py::TestOmeroTables::test_table_html PASSED [ 63%] test/integration/test_table.py::TestOmeroTables::test_table_pagination PASSED [ 63%] test/integration/test_table.py::TestOmeroTables::test_table_query PASSED [ 63%] test/integration/test_table.py::TestOmeroTables::test_table_bitmask[query_result0] PASSED [ 63%] test/integration/test_table.py::TestOmeroTables::test_table_bitmask[query_result1] PASSED [ 63%] test/integration/test_table.py::TestOmeroTables::test_table_bitmask[query_result2] PASSED [ 64%] test/integration/test_table.py::TestOmeroTables::test_table_bitmask[query_result3] PASSED [ 64%] test/integration/test_table.py::TestOmeroTables::test_table_bitmask[query_result4] PASSED [ 64%] test/integration/test_table.py::TestOmeroTables::test_table_bitmask[query_result5] PASSED [ 64%] test/integration/test_table.py::TestOmeroTables::test_table_metadata PASSED [ 65%] test/integration/test_table.py::TestOmeroTables::test_table_get_where_list[query_result0] PASSED [ 65%] test/integration/test_table.py::TestOmeroTables::test_table_get_where_list[query_result1] PASSED [ 65%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice[query_result0] PASSED [ 65%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice[query_result1] PASSED [ 66%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice[query_result2] PASSED [ 66%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice_errors[query_validity0] PASSED [ 66%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice_errors[query_validity1] PASSED [ 66%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice_errors[query_validity2] PASSED [ 66%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice_errors[query_validity3] PASSED [ 67%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice_errors[query_validity4] PASSED [ 67%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice_errors[query_validity5] PASSED [ 67%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice_errors[query_validity6] PASSED [ 67%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice_errors[query_validity7] PASSED [ 68%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice_errors[query_validity8] PASSED [ 68%] test/integration/test_table.py::TestOmeroTables::test_table_perform_slice_errors[query_validity9] PASSED [ 68%] test/integration/test_tags.py::TestTags::test_create_tag_and_tagset PASSED [ 68%] test/integration/test_tags.py::TestTags::test_edit_tag_and_tagset[tagset] PASSED [ 69%] test/integration/test_tags.py::TestTags::test_edit_tag_and_tagset[tag] PASSED [ 69%] test/integration/test_tags.py::TestTags::test_add_edit_and_remove_tag PASSED [ 69%] test/integration/test_tags.py::TestTags::test_add_remove_tags PASSED [ 69%] test/integration/test_thumbnails.py::TestThumbnails::test_default_thumb_size[None] PASSED [ 70%] test/integration/test_thumbnails.py::TestThumbnails::test_default_thumb_size[100] PASSED [ 70%] 
test/integration/test_thumbnails.py::TestThumbnails::test_base64_thumb[None] PASSED [ 70%] test/integration/test_thumbnails.py::TestThumbnails::test_base64_thumb[100] PASSED [ 70%] test/integration/test_thumbnails.py::TestThumbnails::test_base64_thumb_set PASSED [ 70%] test/integration/test_thumbnails.py::TestRoiThumbnails::test_roi_thumbnail[0-1] PASSED [ 71%] test/integration/test_thumbnails.py::TestRoiThumbnails::test_roi_thumbnail[0-0] PASSED [ 71%] test/integration/test_thumbnails.py::TestRoiThumbnails::test_roi_thumbnail[1-1] PASSED [ 71%] test/integration/test_thumbnails.py::TestRoiThumbnails::test_roi_thumbnail[1-0] PASSED [ 71%] test/integration/test_tree.py::TestTree::test_marshal_experimenter PASSED [ 72%] test/integration/test_tree.py::TestTree::test_marshal_projects_no_results PASSED [ 72%] test/integration/test_tree.py::TestTree::test_marshal_projects_user PASSED [ 72%] test/integration/test_tree.py::TestTree::test_marshal_projects_another_user PASSED [ 72%] test/integration/test_tree.py::TestTree::test_marshal_projects_another_group PASSED [ 73%] test/integration/test_tree.py::TestTree::test_marshal_projects_all_groups PASSED [ 73%] test/integration/test_tree.py::TestTree::test_marshal_projects_all_users PASSED [ 73%] test/integration/test_tree.py::TestTree::test_marshal_projects_all_groups_all_users PASSED [ 73%] test/integration/test_tree.py::TestTree::test_marshal_datasets_no_results PASSED [ 73%] test/integration/test_tree.py::TestTree::test_marshal_datasets_user PASSED [ 74%] test/integration/test_tree.py::TestTree::test_marshal_datasets_another_user PASSED [ 74%] test/integration/test_tree.py::TestTree::test_marshal_datasets_another_group PASSED [ 74%] test/integration/test_tree.py::TestTree::test_marshal_datasets_all_groups PASSED [ 74%] test/integration/test_tree.py::TestTree::test_marshal_datasets_all_users PASSED [ 75%] test/integration/test_tree.py::TestTree::test_marshal_datasets_all_groups_all_users PASSED [ 75%] test/integration/test_tree.py::TestTree::test_marshal_datasets_project PASSED [ 75%] test/integration/test_tree.py::TestTree::test_marshal_datasets_project_crosslink PASSED [ 75%] test/integration/test_tree.py::TestTree::test_marshal_images_no_results PASSED [ 76%] test/integration/test_tree.py::TestTree::test_marshal_images_user PASSED [ 76%] test/integration/test_tree.py::TestTree::test_marshal_images_user_pixels PASSED [ 76%] test/integration/test_tree.py::TestTree::test_marshal_images_thumb_version[thumb0] PASSED [ 76%] test/integration/test_tree.py::TestTree::test_marshal_images_thumb_version[thumb1] PASSED [ 76%] test/integration/test_tree.py::TestTree::test_marshal_images_another_user PASSED [ 77%] test/integration/test_tree.py::TestTree::test_marshal_images_another_group PASSED [ 77%] test/integration/test_tree.py::TestTree::test_marshal_images_all_groups PASSED [ 77%] test/integration/test_tree.py::TestTree::test_marshal_images_all_users PASSED [ 77%] test/integration/test_tree.py::TestTree::test_marshal_images_all_groups_all_users PASSED [ 78%] test/integration/test_tree.py::TestTree::test_marshal_images_dataset PASSED [ 78%] test/integration/test_tree.py::TestTree::test_marshal_images_dataset_no_pixels PASSED [ 78%] test/integration/test_tree.py::TestTree::test_marshal_images_dataset_date PASSED [ 78%] test/integration/test_tree.py::TestTree::test_marshal_images_dataset_crosslink PASSED [ 79%] test/integration/test_tree.py::TestTree::test_marshal_images_share PASSED [ 79%] 
test/integration/test_tree.py::TestTree::test_marshal_screens_no_results PASSED [ 79%] test/integration/test_tree.py::TestTree::test_marshal_screens_user PASSED [ 79%] test/integration/test_tree.py::TestTree::test_marshal_screens_another_user PASSED [ 80%] test/integration/test_tree.py::TestTree::test_marshal_screens_another_group PASSED [ 80%] test/integration/test_tree.py::TestTree::test_marshal_screens_all_groups PASSED [ 80%] test/integration/test_tree.py::TestTree::test_marshal_screens_all_users PASSED [ 80%] test/integration/test_tree.py::TestTree::test_marshal_screens_all_groups_all_users PASSED [ 80%] test/integration/test_tree.py::TestTree::test_marshal_plates_no_results PASSED [ 81%] test/integration/test_tree.py::TestTree::test_marshal_plates_user PASSED [ 81%] test/integration/test_tree.py::TestTree::test_marshal_plates_another_user PASSED [ 81%] test/integration/test_tree.py::TestTree::test_marshal_plates_another_group PASSED [ 81%] test/integration/test_tree.py::TestTree::test_marshal_plates_all_groups PASSED [ 82%] test/integration/test_tree.py::TestTree::test_marshal_plates_all_users PASSED [ 82%] test/integration/test_tree.py::TestTree::test_marshal_plates_all_groups_all_users PASSED [ 82%] test/integration/test_tree.py::TestTree::test_marshal_plate_acquisitions_no_results PASSED [ 82%] test/integration/test_tree.py::TestTree::test_marshal_plate_acquisitions_user PASSED [ 83%] test/integration/test_tree.py::TestTree::test_marshal_plate_acquisitions_another_user PASSED [ 83%] test/integration/test_tree.py::TestTree::test_marshal_plate_acquisitions_another_group PASSED [ 83%] test/integration/test_tree.py::TestTree::test_marshal_orphaned_no_results PASSED [ 83%] test/integration/test_tree.py::TestTree::test_marshal_orphaned PASSED [ 83%] test/integration/test_tree.py::TestTree::test_marshal_orphaned_another_user PASSED [ 84%] test/integration/test_tree.py::TestTree::test_marshal_orphaned_another_group PASSED [ 84%] test/integration/test_tree.py::TestTree::test_marshal_orphaned_all_groups PASSED [ 84%] test/integration/test_tree.py::TestTree::test_marshal_tags_no_results PASSED [ 84%] test/integration/test_tree.py::TestTree::test_marshal_tags_user PASSED [ 85%] test/integration/test_tree.py::TestTree::test_marshal_tags_another_user PASSED [ 85%] test/integration/test_tree.py::TestTree::test_marshal_tags_another_group PASSED [ 85%] test/integration/test_tree.py::TestTree::test_marshal_tags_all_groups PASSED [ 85%] test/integration/test_tree.py::TestTree::test_marshal_tags_all_users PASSED [ 86%] test/integration/test_tree.py::TestTree::test_marshal_tags_all_groups_all_users PASSED [ 86%] test/integration/test_tree.py::TestTree::test_marshal_tags_orphaned PASSED [ 86%] test/integration/test_tree.py::TestTree::test_marshal_tags_tagset PASSED [ 86%] test/integration/test_tree.py::TestTree::test_marshal_tagged_no_results PASSED [ 86%] test/integration/test_tree.py::TestTree::test_marshal_tagged_user PASSED [ 87%] test/integration/test_tree.py::TestTree::test_marshal_tagged_perms PASSED [ 87%] test/integration/test_tree.py::TestTree::test_marshal_tagged_image_pixels PASSED [ 87%] test/integration/test_tree.py::TestTree::test_marshal_shares_user PASSED [ 87%] test/integration/test_tree.py::TestTree::test_marshal_shares_another_user PASSED [ 88%] test/integration/test_tree.py::TestTree::test_marshal_shares_user_owned PASSED [ 88%] test/integration/test_tree.py::TestTree::test_marshal_shares_another_user_owned PASSED [ 88%] 
test/integration/test_tree.py::TestTree::test_marshal_discussions_user PASSED [ 88%] test/integration/test_tree.py::TestTree::test_marshal_discussions_another_user PASSED [ 89%] test/integration/test_tree.py::TestTree::test_marshal_discussions_user_owned PASSED [ 89%] test/integration/test_tree.py::TestTree::test_marshal_discussions_another_user_owned PASSED [ 89%] test/integration/test_tree_annotations.py::TestTreeAnnotations::test_single_tag PASSED [ 89%] test/integration/test_tree_annotations.py::TestTreeAnnotations::test_single_tag_userB PASSED [ 90%] test/integration/test_tree_annotations.py::TestTreeAnnotations::test_twin_tags_userA_userB PASSED [ 90%] test/integration/test_tree_annotations.py::TestTreeAnnotations::test_twin_tags_projects PASSED [ 90%] test/integration/test_tree_annotations.py::TestTreeAnnotations::test_tags_comments_project PASSED [ 90%] test/integration/test_tree_annotations.py::TestTreeAnnotations::test_filter_by_namespace PASSED [ 90%] test/integration/test_webadmin.py::TestUserSettings::test_user_settings_page PASSED [ 91%] test/integration/test_webadmin.py::TestUserSettings::test_edit_settings PASSED [ 91%] test/integration/test_webadmin.py::TestExperimenters::test_create_experimenter_roles[user] PASSED [ 91%] test/integration/test_webadmin.py::TestExperimenters::test_create_experimenter_roles[restricted_administrator] PASSED [ 91%] test/integration/test_webadmin.py::TestExperimenters::test_create_experimenter_roles[administrator] PASSED [ 92%] test/integration/test_webadmin.py::TestExperimenters::test_required_fields[required_field0] PASSED [ 92%] test/integration/test_webadmin.py::TestExperimenters::test_required_fields[required_field1] PASSED [ 92%] test/integration/test_webadmin.py::TestExperimenters::test_required_fields[required_field2] PASSED [ 92%] test/integration/test_webadmin.py::TestExperimenters::test_required_fields[required_field3] PASSED [ 93%] test/integration/test_webadmin.py::TestExperimenters::test_required_fields[required_field4] PASSED [ 93%] test/integration/test_webadmin.py::TestExperimenters::test_required_fields[required_field5] PASSED [ 93%] test/integration/test_webadmin.py::TestExperimenters::test_required_fields[required_field6] PASSED [ 93%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin[privilege0] PASSED [ 93%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin[privilege1] PASSED [ 94%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin[privilege2] PASSED [ 94%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin[privilege3] PASSED [ 94%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin[privilege4] PASSED [ 94%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin[privilege5] PASSED [ 95%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin[privilege6] PASSED [ 95%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin[privilege7] PASSED [ 95%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin[privilege8] PASSED [ 95%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin[privilege9] PASSED [ 96%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin_form[privileges0] PASSED [ 96%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin_form[privileges1] PASSED [ 
96%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin_form[privileges2] PASSED [ 96%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin_form[privileges3] PASSED [ 96%] test/integration/test_webadmin.py::TestExperimenters::test_create_restricted_admin_form[privileges4] PASSED [ 97%] test/integration/test_webadmin.py::TestExperimenters::test_restricted_admin_create_edit_user PASSED [ 97%] test/integration/test_webadmin.py::TestGroups::test_new_group_form[privileges0] PASSED [ 97%] test/integration/test_webadmin.py::TestGroups::test_new_group_form[privileges1] PASSED [ 97%] test/integration/test_webadmin.py::TestGroups::test_new_group_form[privileges2] PASSED [ 98%] test/integration/test_webadmin.py::TestGroups::test_create_group_permissions[permissions0] PASSED [ 98%] test/integration/test_webadmin.py::TestGroups::test_create_group_permissions[permissions1] PASSED [ 98%] test/integration/test_webadmin.py::TestGroups::test_create_group_permissions[permissions2] PASSED [ 98%] test/integration/test_webadmin.py::TestGroups::test_create_group_permissions[permissions3] PASSED [ 99%] test/integration/test_webadmin.py::TestGroups::test_required_fields[required_field0] PASSED [ 99%] test/integration/test_webadmin.py::TestGroups::test_required_fields[required_field1] PASSED [ 99%] test/integration/test_webadmin.py::TestGroups::test_validation_errors PASSED [ 99%] test/integration/test_webadmin.py::TestGroups::test_save_experimenter PASSED [100%] =============================== warnings summary =============================== ../../../../.venv3/lib64/python3.9/site-packages/Ice.py:14 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/Ice.py:14: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses import sys, string, imp, os, threading, warnings, datetime ../../../../.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:241 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:241: RemovedInDjango50Warning: The default value of USE_TZ will change from False to True in Django 5.0. Set USE_TZ to False in your project settings if you want to keep the current default behavior. warnings.warn( ../../../../.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:289 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/django/conf/__init__.py:289: RemovedInDjango51Warning: The STATICFILES_STORAGE setting is deprecated. Use STORAGES instead. warnings.warn(STATICFILES_STORAGE_DEPRECATED_MSG, RemovedInDjango51Warning) ../../../../.venv3/lib64/python3.9/site-packages/pipeline/__init__.py:1 /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/pipeline/__init__.py:1: DeprecationWarning: pkg_resources is deprecated as an API. 
See https://setuptools.pypa.io/en/latest/pkg_resources.html from pkg_resources import DistributionNotFound, get_distribution OmeroWeb/test/integration/test_annotate.py: 69 warnings OmeroWeb/test/integration/test_api_containers.py: 177 warnings OmeroWeb/test/integration/test_api_errors.py: 19 warnings OmeroWeb/test/integration/test_api_experimenters_groups.py: 57 warnings OmeroWeb/test/integration/test_api_images.py: 13 warnings OmeroWeb/test/integration/test_api_login.py: 8 warnings OmeroWeb/test/integration/test_api_projects.py: 87 warnings OmeroWeb/test/integration/test_api_rois.py: 25 warnings OmeroWeb/test/integration/test_api_wells.py: 20 warnings OmeroWeb/test/integration/test_chgrp.py: 43 warnings OmeroWeb/test/integration/test_chown.py: 16 warnings OmeroWeb/test/integration/test_config.py: 5 warnings OmeroWeb/test/integration/test_containers.py: 19 warnings OmeroWeb/test/integration/test_csrf.py: 38 warnings OmeroWeb/test/integration/test_decorators.py: 5 warnings OmeroWeb/test/integration/test_download.py: 28 warnings OmeroWeb/test/integration/test_groups_users.py: 10 warnings OmeroWeb/test/integration/test_histogram.py: 7 warnings OmeroWeb/test/integration/test_history.py: 12 warnings OmeroWeb/test/integration/test_links.py: 20 warnings OmeroWeb/test/integration/test_login.py: 13 warnings OmeroWeb/test/integration/test_marshal.py: 6 warnings OmeroWeb/test/integration/test_metadata.py: 16 warnings OmeroWeb/test/integration/test_plategrid.py: 29 warnings OmeroWeb/test/integration/test_rendering.py: 66 warnings OmeroWeb/test/integration/test_scripts.py: 38 warnings OmeroWeb/test/integration/test_show.py: 4 warnings OmeroWeb/test/integration/test_table.py: 110 warnings OmeroWeb/test/integration/test_tags.py: 26 warnings OmeroWeb/test/integration/test_thumbnails.py: 68 warnings OmeroWeb/test/integration/test_webadmin.py: 104 warnings /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/django/core/serializers/base.py:22: RemovedInDjango50Warning: PickleSerializer is deprecated due to its security risk. Use JSONSerializer instead. warnings.warn( OmeroWeb/test/integration/test_api_containers.py: 67 warnings OmeroWeb/test/integration/test_api_images.py: 6 warnings OmeroWeb/test/integration/test_api_wells.py: 24 warnings OmeroWeb/test/integration/test_containers.py: 2 warnings OmeroWeb/test/integration/test_csrf.py: 5 warnings OmeroWeb/test/integration/test_download.py: 9 warnings OmeroWeb/test/integration/test_histogram.py: 2 warnings OmeroWeb/test/integration/test_metadata.py: 2 warnings OmeroWeb/test/integration/test_rendering.py: 17 warnings OmeroWeb/test/integration/test_scripts.py: 30 warnings OmeroWeb/test/integration/test_thumbnails.py: 23 warnings OmeroWeb/test/integration/test_tree.py: 20 warnings /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/util/script_utils.py:1093: DeprecationWarning: tostring() is deprecated. Use tobytes() instead. 
converted_plane = byte_swapped_plane.tostring() OmeroWeb/test/integration/test_api_errors.py::TestErrors::test_marshal_type /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_marshal/__init__.py:44: DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead logger.warn('Requested unknown decoder %s' % t, exc_info=True) OmeroWeb/test/integration/test_api_errors.py::TestErrors::test_marshal_validation /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_marshal/__init__.py:97: DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead logger.warn('Requested unknown decoder %s' % t, exc_info=True) OmeroWeb/test/integration/test_api_errors.py::TestErrors::test_security_violation OmeroWeb/test/integration/test_api_errors.py::TestErrors::test_validation_exception OmeroWeb/test/integration/test_api_errors.py::TestErrors::test_project_validation OmeroWeb/test/integration/test_api_projects.py::TestProjects::test_project_delete OmeroWeb/test/integration/test_api_rois.py::TestContainers::test_roi_delete OmeroWeb/test/integration/test_links.py::TestLinks::test_link_datasets_images OmeroWeb/test/integration/test_links.py::TestLinks::test_link_datasets_images /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py:4810: DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead logger.warn("%s on %s to <%s> %s(%r, %r)", OmeroWeb/test/integration/test_metadata.py::TestBulkAnnotations::test_nsbulkannotations_file[True] OmeroWeb/test/integration/test_metadata.py::TestBulkAnnotations::test_nsbulkannotations_file[False] OmeroWeb/test/integration/test_metadata.py::TestBulkAnnotations::test_nsbulkannotations_not_file /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omeroweb/testlib/__init__.py:478: DeprecationWarning: This method is deprecated as of OMERO 5.4.0. Use get warnings.warn( OmeroWeb/test/integration/test_rendering.py::TestRendering::test_copy_past_rendering_settings_from_url OmeroWeb/test/integration/test_rendering.py::TestRendering::test_copy_past_rendering_settings_from_url OmeroWeb/test/integration/test_rendering.py::TestRendering::test_copy_past_rendering_settings_from_url OmeroWeb/test/integration/test_rendering.py::TestRenderImageRegion::test_render_image_region_incomplete_request /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py:8856: DeprecationWarning: setActiveChannels() is deprecated in OMERO 5.4.0.Use set_active_channels warnings.warn("setActiveChannels() is deprecated in OMERO 5.4.0." OmeroWeb/test/integration/test_rendering.py::TestRendering::test_copy_past_rendering_settings_from_url OmeroWeb/test/integration/test_rendering.py::TestRendering::test_copy_past_rendering_settings_from_url OmeroWeb/test/integration/test_rendering.py::TestRendering::test_copy_past_rendering_settings_from_url OmeroWeb/test/integration/test_rendering.py::TestRendering::test_copy_past_rendering_settings_from_url /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py:9118: DeprecationWarning: Deprecated in 5.4.0. Use setChannelInverted() warnings.warn("Deprecated in 5.4.0. 
Use setChannelInverted()", -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html - generated xml file: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroWeb/target/reports/integration/junit-results.xml - =============== 430 passed, 1389 warnings in 2636.08s (0:43:56) ================ !! 10/24/24 08:12:00.012 error: 8 communicators not destroyed during global destruction. BUILD SUCCESSFUL Total time: 43 minutes 59 seconds + /home/omero/workspace/OMERO-test-integration/src/build.py -f components/tools/OmeroPy/build.xml integration -DMARK=broken -Dtestreports.dir=target/reports/broken OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0 Buildfile: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/build.xml Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy... Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy... python-integration: Created dir: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/target/reports/broken ============================= test session starts ============================== platform linux -- Python 3.9.18, pytest-8.3.3, pluggy-1.5.0 -- /home/omero/workspace/OMERO-test-integration/.venv3/bin/python3 cachedir: .pytest_cache django: version: 4.2.16, settings: omeroweb.settings (from ini) rootdir: /home/omero/workspace/OMERO-test-integration/src/components/tools configfile: pytest.ini plugins: xdist-3.6.1, mock-3.14.0, django-4.9.0 collecting ... collected 2061 items / 2025 deselected / 36 selected test/integration/clitest/test_fs.py::TestFS::testRenameAdminOnly FAILED [ 2%] test/integration/clitest/test_import.py::TestImport::testTargetInDifferentGroup[Dataset-test.fake--d] FAILED [ 5%] test/integration/clitest/test_import.py::TestImport::testTargetInDifferentGroup[Screen-SPW&plates=1&plateRows=1&plateCols=1&fields=1&plateAcqs=1.fake--r] FAILED [ 8%] test/integration/gatewaytest/test_multi_group.py::TestHistory::testCreateHistory FAILED [ 11%] test/integration/gatewaytest/test_multi_group.py::TestScript::testRunScript FAILED [ 13%] test/integration/gatewaytest/test_performance.py::TestPerformance::testListFileAnnotations FAILED [ 16%] test/integration/gatewaytest/test_user.py::TestUser::testCrossGroupRead FAILED [ 19%] test/integration/scriptstest/test_ping.py::TestPing::testProcessCallback FAILED [ 22%] test/integration/scriptstest/test_repo.py::TestScriptRepo::testGetGroupScripts FAILED [ 25%] test/integration/tablestest/test_service.py::TestTables::test2098 FAILED [ 27%] test/integration/tablestest/test_service.py::TestTables::testReadOnlyFile FAILED [ 30%] test/integration/tablestest/test_service.py::TestTables::testReadEqual FAILED [ 33%] test/integration/tablestest/test_service.py::TestTables::testReadOutOfRange FAILED [ 36%] test/integration/test_admin.py::TestAdmin::testChangePasswordWhenUnset FAILED [ 38%] test/integration/test_admin.py::TestAdmin::test9193 FAILED [ 41%] test/integration/test_files.py::TestFiles::testUploadDifferentSizeTicket2337 FAILED [ 44%] test/integration/test_ishare.py::TestIShare::test1172 FAILED [ 47%] test/integration/test_itimeline.py::TestITimeline::test1225 FAILED [ 50%] test/integration/test_permissions.py::TestPermissions::test3136 FAILED [ 52%] test/integration/test_permissions.py::TestPermissions::testSaveWithNegOneExplicit FAILED [ 55%] test/integration/test_permissions.py::TestPermissions::testSaveWithNegBadLink FAILED [ 58%] 
test/integration/test_permissions.py::TestPermissions::testSaveBadLink FAILED [ 61%] test/integration/test_permissions.py::TestPermissions::testUseOfRawFileBeanScriptReadCorrectGroupAndUser FAILED [ 63%] test/integration/test_rawfilestore.py::TestRFS::testTicket1961Basic FAILED [ 66%] test/integration/test_rawfilestore.py::TestRFS::testTicket1961WithKillSession FAILED [ 69%] test/integration/test_rawfilestore.py::TestRFS::testTicket2161Save FAILED [ 72%] test/integration/test_rawfilestore.py::TestRFS::testNoWrite FAILED [ 75%] test/integration/test_reporawfilestore.py::TestRepoRawFileStore::testFailedWriteNoFile FAILED [ 77%] test/integration/test_scripts.py::TestScripts::testParseErrorTicket2185 PASSED [ 80%] test/integration/test_scripts.py::TestScripts::testAutoFillTicket2326 FAILED [ 83%] test/integration/test_scripts.py::TestScripts::testParamLoadingPerformanceTicket2285 FAILED [ 86%] test/integration/test_scripts.py::TestScripts::test3527 FAILED [ 88%] test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testPrivate10618RootWithNoCtx FAILED [ 91%] test/integration/test_tickets2000.py::TestTickets2000::test1184 FAILED [ 94%] test/integration/test_tickets4000.py::TestTickets4000::test3138 PASSED [ 97%] test/integration/test_tickets6000.py::TestTickets6000::test5684 PASSED [100%] =================================== FAILURES =================================== __________________________ TestFS.testRenameAdminOnly __________________________ self = capsys = <_pytest.capture.CaptureFixture object at 0x7f7a0cfd76d0> @pytest.mark.broken(reason="fs rename is temporarily disabled") def testRenameAdminOnly(self, capsys): """Test fs rename is admin-only""" self.args += ["rename", "Fileset:1"] with pytest.raises(NonZeroReturnCode): self.cli.invoke(self.args, strict=True) out, err = capsys.readouterr() > assert err.endswith("SecurityViolation: Admins only!\n") E AssertionError: assert False E + where False = ('SecurityViolation: Admins only!\n') E + where = 'disabled since OMERO 5.4.7 due to Pixels.path bug\n'.endswith test/integration/clitest/test_fs.py:118: AssertionError _________ TestImport.testTargetInDifferentGroup[Dataset-test.fake--d] __________ self = container = 'Dataset', filename = 'test.fake', arg = '-d' tmpdir = local('/tmp/pytest-of-omero/pytest-20/testTargetInDifferentGroup_Dat0') capfd = <_pytest.capture.CaptureFixture object at 0x7f7a0c9901f0> @pytest.mark.broken(reason="needs omero.group setting") @pytest.mark.parametrize("container,filename,arg", target_fixtures) def testTargetInDifferentGroup(self, container, filename, arg, tmpdir, capfd): new_group = self.new_group(experimenters=[self.user]) self.sf.getAdminService().getEventContext() # Refresh target = eval("omero.model."+container+"I")() target.name = rstring('testTargetInDifferentGroup') target = self.update.saveAndReturnObject( target, {"omero.group": str(new_group.id.val)}) assert target.details.group.id.val == new_group.id.val fakefile = tmpdir.join(filename) fakefile.write('') self.args += [str(fakefile)] self.args += [arg, '%s' % target.id.val] # Invoke CLI import command and retrieve stdout/stderr > o, e = self.do_import(capfd) test/integration/clitest/test_import.py:1216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/integration/clitest/test_import.py:150: in do_import self.cli.invoke(self.args, strict=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/cli.py:1211: in invoke self.assertRC() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ self = def assertRC(self): if self.rv != 0: > raise NonZeroReturnCode(self.rv, "assert failed") E omero.cli.NonZeroReturnCode: assert failed ../../../../.venv3/lib64/python3.9/site-packages/omero/cli.py:1200: NonZeroReturnCode ----------------------------- Captured stdout call ----------------------------- OOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO EEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE 2024-10-24 08:12:24,802 280 [ main] INFO ome.formats.importer.ImportConfig - OMERO.blitz Version: 5.7.5-SNAPSHOT 2024-10-24 08:12:24,820 298 [ main] INFO ome.formats.importer.ImportConfig - Bioformats version: 8.0.0-SNAPSHOT revision: 62d359b4bc191e66a0e75fbc407c2b440d35be57 date: 24 October 2024 2024-10-24 08:12:24,877 355 [ main] INFO formats.importer.cli.CommandLineImporter - Log levels -- Bio-Formats: ERROR OMERO.importer: INFO 2024-10-24 08:12:25,431 909 [ main] INFO ome.formats.importer.ImportCandidates - Depth: 4 Metadata Level: MINIMUM 2024-10-24 08:12:25,509 987 [ main] INFO ome.formats.importer.ImportCandidates - 1 file(s) parsed into 1 group(s) with 1 call(s) to setId in 75ms. (78ms total) [0 unknowns] 2024-10-24 08:12:25,551 1029 [ main] INFO ome.formats.OMEROMetadataStoreClient - Attempting initial SSL connection to localhost:14064 2024-10-24 08:12:26,290 1768 [ main] INFO ome.formats.OMEROMetadataStoreClient - Insecure connection requested, falling back 2024-10-24 08:12:26,530 2008 [ main] INFO ome.formats.OMEROMetadataStoreClient - Pinging session every 300s. 2024-10-24 08:12:26,538 2016 [ main] INFO ome.formats.OMEROMetadataStoreClient - Server: 5.6.3 2024-10-24 08:12:26,538 2016 [ main] INFO ome.formats.OMEROMetadataStoreClient - Client: 5.7.5-SNAPSHOT 2024-10-24 08:12:26,538 2016 [ main] INFO ome.formats.OMEROMetadataStoreClient - Java Version: 11.0.24 2024-10-24 08:12:26,538 2016 [ main] INFO ome.formats.OMEROMetadataStoreClient - OS Name: Linux 2024-10-24 08:12:26,538 2016 [ main] INFO ome.formats.OMEROMetadataStoreClient - OS Arch: amd64 2024-10-24 08:12:26,538 2016 [ main] INFO ome.formats.OMEROMetadataStoreClient - OS Version: 5.14.0-427.40.1.el9_4.x86_64 2024-10-24 08:12:36,837 12315 [ main] ERROR ome.system.UpgradeCheck - Error reading from url: http://upgrade.openmicroscopy.org.uk?version=5.7.5-SNAPSHOT;os.name=Linux;os.arch=amd64;os.version=5.14.0-427.40.1.el9_4.x86_64;java.runtime.version=11.0.24%2B8-LTS;java.vm.vendor=Red+Hat%2C+Inc. 
"connect timed out" 2024-10-24 08:12:37,073 12551 [ main] INFO ome.formats.importer.ImportConfig - Using import target: Dataset:1346 2024-10-24 08:12:37,084 12562 [2-thread-1] ERROR ome.formats.importer.ImportLibrary - Could not load target: ome.formats.importer.targets.ModelImportTarget@136c4438 2024-10-24 08:12:37,084 12562 [2-thread-1] ERROR ome.formats.importer.ImportLibrary - Error on import java.lang.RuntimeException: Failed to load target at ome.formats.importer.ImportLibrary$1.call(ImportLibrary.java:351) at ome.formats.importer.ImportLibrary$1.call(ImportLibrary.java:328) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: omero.SecurityViolation: null at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490) at java.base/java.lang.Class.newInstance(Class.java:584) at IceInternal.BasicStream.createUserException(BasicStream.java:2785) at IceInternal.BasicStream.access$300(BasicStream.java:14) at IceInternal.BasicStream$EncapsDecoder11.throwException(BasicStream.java:3620) at IceInternal.BasicStream.throwException(BasicStream.java:2291) at IceInternal.OutgoingAsync.throwUserException(OutgoingAsync.java:399) at omero.api.IQueryPrxHelper.end_get(IQueryPrxHelper.java:2000) at omero.api.IQueryPrxHelper.get(IQueryPrxHelper.java:1872) at omero.api.IQueryPrxHelper.get(IQueryPrxHelper.java:1859) at ome.formats.importer.targets.ModelImportTarget.load(ModelImportTarget.java:219) at ome.formats.importer.ImportLibrary$1.call(ImportLibrary.java:334) ... 
7 common frames omitted 2024-10-24 08:12:37,086 12564 [2-thread-1] INFO ome.formats.importer.ImportLibrary - Exiting on error ==> Summary 0 files uploaded, 0 filesets created, 0 images imported, 0 errors in 0:00:00.248 ------------------------------ Captured log call ------------------------------- INFO omero.util.Resources:__init__.py:652 Starting INFO omero.util.Resources:__init__.py:669 Halted _ TestImport.testTargetInDifferentGroup[Screen-SPW&plates=1&plateRows=1&plateCols=1&fields=1&plateAcqs=1.fake--r] _ self = container = 'Screen' filename = 'SPW&plates=1&plateRows=1&plateCols=1&fields=1&plateAcqs=1.fake' arg = '-r' tmpdir = local('/tmp/pytest-of-omero/pytest-20/testTargetInDifferentGroup_Scr0') capfd = <_pytest.capture.CaptureFixture object at 0x7f7a0c9ab5b0> @pytest.mark.broken(reason="needs omero.group setting") @pytest.mark.parametrize("container,filename,arg", target_fixtures) def testTargetInDifferentGroup(self, container, filename, arg, tmpdir, capfd): new_group = self.new_group(experimenters=[self.user]) self.sf.getAdminService().getEventContext() # Refresh target = eval("omero.model."+container+"I")() target.name = rstring('testTargetInDifferentGroup') target = self.update.saveAndReturnObject( target, {"omero.group": str(new_group.id.val)}) assert target.details.group.id.val == new_group.id.val fakefile = tmpdir.join(filename) fakefile.write('') self.args += [str(fakefile)] self.args += [arg, '%s' % target.id.val] # Invoke CLI import command and retrieve stdout/stderr > o, e = self.do_import(capfd) test/integration/clitest/test_import.py:1216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/integration/clitest/test_import.py:150: in do_import self.cli.invoke(self.args, strict=True) ../../../../.venv3/lib64/python3.9/site-packages/omero/cli.py:1211: in invoke self.assertRC() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def assertRC(self): if self.rv != 0: > raise NonZeroReturnCode(self.rv, "assert failed") E omero.cli.NonZeroReturnCode: assert failed ../../../../.venv3/lib64/python3.9/site-packages/omero/cli.py:1200: NonZeroReturnCode ----------------------------- Captured stdout call ----------------------------- OOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO EEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE 2024-10-24 08:12:37,896 254 [ main] INFO ome.formats.importer.ImportConfig - OMERO.blitz Version: 5.7.5-SNAPSHOT 2024-10-24 08:12:37,912 270 [ main] INFO ome.formats.importer.ImportConfig - Bioformats version: 8.0.0-SNAPSHOT revision: 62d359b4bc191e66a0e75fbc407c2b440d35be57 date: 24 October 2024 2024-10-24 08:12:37,963 321 [ main] INFO formats.importer.cli.CommandLineImporter - Log levels -- Bio-Formats: ERROR OMERO.importer: INFO 2024-10-24 08:12:38,266 624 [ main] INFO ome.formats.importer.ImportCandidates - Depth: 4 Metadata Level: MINIMUM 2024-10-24 08:12:38,584 942 [ main] INFO ome.formats.importer.ImportCandidates - 1 file(s) parsed into 1 group(s) with 1 call(s) to setId in 315ms. (318ms total) [0 unknowns] 2024-10-24 08:12:38,619 977 [ main] INFO ome.formats.OMEROMetadataStoreClient - Attempting initial SSL connection to localhost:14064 2024-10-24 08:12:39,049 1407 [ main] INFO ome.formats.OMEROMetadataStoreClient - Insecure connection requested, falling back 2024-10-24 08:12:39,282 1640 [ main] INFO ome.formats.OMEROMetadataStoreClient - Pinging session every 300s. 
2024-10-24 08:12:39,289 1647 [ main] INFO ome.formats.OMEROMetadataStoreClient - Server: 5.6.3 2024-10-24 08:12:39,289 1647 [ main] INFO ome.formats.OMEROMetadataStoreClient - Client: 5.7.5-SNAPSHOT 2024-10-24 08:12:39,289 1647 [ main] INFO ome.formats.OMEROMetadataStoreClient - Java Version: 11.0.24 2024-10-24 08:12:39,289 1647 [ main] INFO ome.formats.OMEROMetadataStoreClient - OS Name: Linux 2024-10-24 08:12:39,290 1648 [ main] INFO ome.formats.OMEROMetadataStoreClient - OS Arch: amd64 2024-10-24 08:12:39,290 1648 [ main] INFO ome.formats.OMEROMetadataStoreClient - OS Version: 5.14.0-427.40.1.el9_4.x86_64 2024-10-24 08:12:49,366 11724 [ main] ERROR ome.system.UpgradeCheck - Error reading from url: http://upgrade.openmicroscopy.org.uk?version=5.7.5-SNAPSHOT;os.name=Linux;os.arch=amd64;os.version=5.14.0-427.40.1.el9_4.x86_64;java.runtime.version=11.0.24%2B8-LTS;java.vm.vendor=Red+Hat%2C+Inc. "connect timed out" 2024-10-24 08:12:49,615 11973 [ main] INFO ome.formats.importer.ImportConfig - Using import target: Screen:232 2024-10-24 08:12:49,631 11989 [2-thread-1] ERROR ome.formats.importer.ImportLibrary - Could not load target: ome.formats.importer.targets.ModelImportTarget@21e5a652 2024-10-24 08:12:49,631 11989 [2-thread-1] ERROR ome.formats.importer.ImportLibrary - Error on import java.lang.RuntimeException: Failed to load target at ome.formats.importer.ImportLibrary$1.call(ImportLibrary.java:351) at ome.formats.importer.ImportLibrary$1.call(ImportLibrary.java:328) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: omero.SecurityViolation: null at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490) at java.base/java.lang.Class.newInstance(Class.java:584) at IceInternal.BasicStream.createUserException(BasicStream.java:2785) at IceInternal.BasicStream.access$300(BasicStream.java:14) at IceInternal.BasicStream$EncapsDecoder11.throwException(BasicStream.java:3620) at IceInternal.BasicStream.throwException(BasicStream.java:2291) at IceInternal.OutgoingAsync.throwUserException(OutgoingAsync.java:399) at omero.api.IQueryPrxHelper.end_get(IQueryPrxHelper.java:2000) at omero.api.IQueryPrxHelper.get(IQueryPrxHelper.java:1872) at omero.api.IQueryPrxHelper.get(IQueryPrxHelper.java:1859) at ome.formats.importer.targets.ModelImportTarget.load(ModelImportTarget.java:219) at ome.formats.importer.ImportLibrary$1.call(ImportLibrary.java:334) ... 
7 common frames omitted 2024-10-24 08:12:49,633 11991 [2-thread-1] INFO ome.formats.importer.ImportLibrary - Exiting on error ==> Summary 0 files uploaded, 0 filesets created, 0 images imported, 0 errors in 0:00:00.267 ------------------------------ Captured log call ------------------------------- INFO omero.util.Resources:__init__.py:652 Starting INFO omero.util.Resources:__init__.py:669 Halted ________________________ TestHistory.testCreateHistory _________________________ self = gatewaywrapper = @pytest.mark.broken(ticket="11494") def testCreateHistory(self, gatewaywrapper): # Login as user... gatewaywrapper.doLogin(dbhelpers.USERS['history_test_user']) userId = gatewaywrapper.gateway.getEventContext().userId uuid = gatewaywrapper.gateway.getEventContext().sessionUuid default_groupId = gatewaywrapper.gateway.getEventContext().groupId start = int(round(time.time() * 1000)) - 1000 # Create Dataset in 'default' group update = gatewaywrapper.gateway.getUpdateService() new_ds = omero.model.DatasetI() dataset_name = "history_test_%s" % uuid new_ds.name = rstring(dataset_name) new_ds = update.saveAndReturnObject(new_ds) new_ds_Id = new_ds.id.val # As Admin, create a second group with this user & upload script gatewaywrapper.loginAsAdmin() gid = gatewaywrapper.gateway.createGroup( "history-test-%s" % uuid, member_Ids=[userId], perms=READWRITE) # login as User gatewaywrapper.doLogin(dbhelpers.USERS['history_test_user']) end = int(round(time.time() * 1000)) + 1000 self.searchHistory(gatewaywrapper.gateway, start, end) # switch user into new group switched = gatewaywrapper.gateway.c.sf.setSecurityContext( omero.model.ExperimenterGroupI(gid, False)) assert switched, "Failed to switch into new group" # Shouldn't be able to access Dataset... > self.searchHistory(gatewaywrapper.gateway, start, end) test/integration/gatewaytest/test_multi_group.py:96: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = gateway = start = 1729753970253, end = 1729753972398, dtype = 'Dataset' def searchHistory(self, gateway, start, end, dtype="Dataset"): tm = gateway.getTimelineService() count = tm.countByPeriod([dtype], rtime(int(start)), rtime(int(end)), None, gateway.SERVICE_OPTS) data = tm.getByPeriod([dtype], rtime(int(start)), rtime(int(end)), None, True, gateway.SERVICE_OPTS) logs = tm.getEventLogsByPeriod(rtime(start), rtime(end), None, gateway.SERVICE_OPTS) entityType = 'ome.model.containers.%s' % dtype filteredLogs = [{'id': i.entityId.val, 'action': i.action.val} for i in logs if i.entityType.val == entityType] typeCount = count[dtype] dataCount = len(data[dtype]) logCount = len(filteredLogs) assert typeCount == dataCount, \ "Period count should match number of objects" > assert logCount == dataCount, \ "Logs count should match number of objects" E AssertionError: Logs count should match number of objects E assert 1 == 0 test/integration/gatewaytest/test_multi_group.py:58: AssertionError ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=98a47330-3fa8-4ac3-9cbf-cc26e12186af) INFO omero.gateway:__init__.py:2243 created connection (uuid=b17acd38-84c6-44b2-95b2-5258b7b48949) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=5515b245-90df-4cc1-9c9d-4d73ec48f267) INFO 
omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=77cde3e0-b738-41c8-a0e9-49463c9e5b32) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=4c991a9f-de40-4c15-90f6-43d0ed3235b2) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=502fdaab-d285-4f9b-af38-aa505f51f31a) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=74b6c69a-6399-4e0e-97c7-963e2af0802a) WARNING omero.gateway:__init__.py:4810 ApiUsageException on to <5045a391-3c49-45a9-b761-63ed336f87fbomero.api.IAdmin> lookupExperimenter(('history_test_user',), {}) Traceback (most recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py", line 4830, in __call__ return self.f(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_api_IAdmin_ice.py", line 824, in lookupExperimenter return _M_omero.api.IAdmin._op_lookupExperimenter.invoke(self, ((name, ), _ctx)) omero.ApiUsageException: exception ::omero::ApiUsageException { serverStackTrace = ome.conditions.ApiUsageException: No such experimenter: history_test_user at ome.logic.AdminImpl.lookupExperimenter(AdminImpl.java:325) at jdk.internal.reflect.GeneratedMethodAccessor631.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.lookupExperimenter(Unknown Source) at 
jdk.internal.reflect.GeneratedMethodAccessor631.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.lookupExperimenter(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor3372.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.AdminI.lookupExperimenter_async(AdminI.java:256) at jdk.internal.reflect.GeneratedMethodAccessor3371.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy97.lookupExperimenter_async(Unknown Source) at omero.api._IAdminTie.lookupExperimenter_async(_IAdminTie.java:264) at omero.api._IAdminDisp.___lookupExperimenter(_IAdminDisp.java:924) at omero.api._IAdminDisp.__dispatch(_IAdminDisp.java:2345) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ApiUsageException message = No such experimenter: history_test_user } WARNING omero.gateway:__init__.py:4810 ApiUsageException on to <5045a391-3c49-45a9-b761-63ed336f87fbomero.api.IAdmin> lookupGroup(('rw_history',), {}) Traceback (most 
recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py", line 4830, in __call__ return self.f(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_api_IAdmin_ice.py", line 937, in lookupGroup return _M_omero.api.IAdmin._op_lookupGroup.invoke(self, ((name, ), _ctx)) omero.ApiUsageException: exception ::omero::ApiUsageException { serverStackTrace = ome.conditions.ApiUsageException: No such group: rw_history at ome.logic.AdminImpl.lookupGroup(AdminImpl.java:366) at jdk.internal.reflect.GeneratedMethodAccessor563.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.lookupGroup(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor563.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at 
org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.lookupGroup(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor589.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.AdminI.lookupGroup_async(AdminI.java:266) at jdk.internal.reflect.GeneratedMethodAccessor588.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy97.lookupGroup_async(Unknown Source) at omero.api._IAdminTie.lookupGroup_async(_IAdminTie.java:276) at omero.api._IAdminDisp.___lookupGroup(_IAdminDisp.java:990) at omero.api._IAdminDisp.__dispatch(_IAdminDisp.java:2353) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ApiUsageException message = No such group: rw_history } INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) ------------------------------ Captured log call ------------------------------- INFO omero.gateway:__init__.py:2243 created connection (uuid=2a7be226-a1f2-4431-8f5e-7b2a84350c33) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=4ebfe81c-612c-4d9d-b068-e4233fb32224) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=908d1fd1-06b9-4bca-b5d3-752a12d2ca42) --------------------------- Captured stderr teardown --------------------------- ** ---------------------------- Captured log teardown ----------------------------- INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=d43895cb-589e-4f85-a9c7-ff1ca648a0c4) INFO omero.gateway:__init__.py:2243 created connection (uuid=7cef1681-e9ff-421b-ad8e-3770132fa09a) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO 
omero.gateway:__init__.py:2243 created connection (uuid=a6ecd770-0988-4d19-837a-4a88cc6cadb3) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) ___________________________ TestScript.testRunScript ___________________________ self = gatewaywrapper = @pytest.mark.broken(ticket="11610") def testRunScript(self, gatewaywrapper): # Login as user... gatewaywrapper.doLogin(dbhelpers.USERS['script_test_user']) userId = gatewaywrapper.gateway.getEventContext().userId uuid = gatewaywrapper.gateway.getEventContext().sessionUuid default_groupId = gatewaywrapper.gateway.getEventContext().groupId # Create Dataset in 'default' group update = gatewaywrapper.gateway.getUpdateService() new_ds = omero.model.DatasetI() dataset_name = "script_test_%s" % uuid new_ds.name = rstring(dataset_name) new_ds = update.saveAndReturnObject(new_ds) new_ds_Id = new_ds.id.val # As Admin, create a second group with this user & upload script gatewaywrapper.loginAsAdmin() gid = gatewaywrapper.gateway.createGroup( "script-test-%s" % uuid, member_Ids=[userId], perms=READWRITE) SCRIPT = """if True: import omero.scripts import omero.rtypes client = omero.scripts.client("ticket8573", \ omero.scripts.Long("datasetId"), \ omero.scripts.String("datasetName", out=True)) ec = client.sf.getAdminService().getEventContext() gid = ec.groupId qs = client.sf.getQueryService() ds_Id = client.getInput("datasetId").getValue() print("Running test...") # generate stdout try: dataset = qs.find("Dataset", ds_Id) ds_Name = dataset.name.val print(ds_Name) except Exception: ds_Name = "Not Found" client.setOutput("gid", omero.rtypes.rlong(gid)) client.setOutput("datasetName", omero.rtypes.rstring(ds_Name)) """ svc = gatewaywrapper.gateway.getScriptService() scriptID = svc.uploadOfficialScript( "/test/ticket8573/%s" % uuid, SCRIPT) # switch user into new group gatewaywrapper.doLogin(dbhelpers.USERS['script_test_user']) switched = gatewaywrapper.gateway.c.sf.setSecurityContext( omero.model.ExperimenterGroupI(gid, False)) assert switched, "Failed to switch into new group" # Shouldn't be able to access Dataset... 
value = gatewaywrapper.gateway.getObject("Dataset", new_ds_Id) assert value is None gatewaywrapper.gateway.SERVICE_OPTS.setOmeroGroup( str(default_groupId)) value = gatewaywrapper.gateway.getObject("Dataset", new_ds_Id) assert value is not None # run script svc = gatewaywrapper.gateway.getScriptService() > process = svc.runScript(scriptID, {"datasetId": rlong(new_ds_Id)}, None, gatewaywrapper.gateway.SERVICE_OPTS) test/integration/gatewaytest/test_multi_group.py:188: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py:4833: in __call__ return self.handle_exception(e, *args, **kwargs) ../../../../.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py:4830: in __call__ return self.f(*args, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = 5dc53f01-284c-437d-9b7e-caec06d5f787/93fc0a43-9c40-4dcd-a1cb-35d0024d4a4comero.api.IScript -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 scriptID = 5945 inputs = {'datasetId': object #0 (::omero::RLong) { _val = 1354 }} waitSecs = None _ctx = def runScript(self, scriptID, inputs, waitSecs, _ctx=None): > return _M_omero.api.IScript._op_runScript.invoke(self, ((scriptID, inputs, waitSecs), _ctx)) E omero.NoProcessorAvailable: exception ::omero::NoProcessorAvailable E { E serverStackTrace = E serverExceptionClass = E message = No processor available! [0 response(s)] E processorCount = 0 E } ../../../../.venv3/lib64/python3.9/site-packages/omero_api_IScript_ice.py:935: NoProcessorAvailable ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=8f99b03c-c1c8-43e8-b70b-2f5adcd87464) INFO omero.gateway:__init__.py:2243 created connection (uuid=4a31222b-4ff7-453b-9a7d-d5d52f9880bb) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=b5c5d1a2-185a-4fbd-859b-f2d2ca649429) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=9b390c56-cb0b-4356-a88a-395e4476aef6) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=a1b96777-00e8-4dcc-8f2d-5a928d1aa2fc) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=2e8539a0-9873-4561-a746-46cc53b5601d) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=3fefa7db-be61-4b8c-bc87-1c590331370b) WARNING omero.gateway:__init__.py:4810 ApiUsageException on to <5f326e87-9143-4773-bb48-c3b6ca8b5c16omero.api.IAdmin> lookupExperimenter(('script_test_user',), {}) Traceback (most recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py", line 4830, in __call__ return self.f(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_api_IAdmin_ice.py", line 824, in lookupExperimenter return _M_omero.api.IAdmin._op_lookupExperimenter.invoke(self, ((name, 
), _ctx)) omero.ApiUsageException: exception ::omero::ApiUsageException { serverStackTrace = ome.conditions.ApiUsageException: No such experimenter: script_test_user at ome.logic.AdminImpl.lookupExperimenter(AdminImpl.java:325) at jdk.internal.reflect.GeneratedMethodAccessor631.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.lookupExperimenter(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor631.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.lookupExperimenter(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor3372.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at 
ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.AdminI.lookupExperimenter_async(AdminI.java:256) at jdk.internal.reflect.GeneratedMethodAccessor3371.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy97.lookupExperimenter_async(Unknown Source) at omero.api._IAdminTie.lookupExperimenter_async(_IAdminTie.java:264) at omero.api._IAdminDisp.___lookupExperimenter(_IAdminDisp.java:924) at omero.api._IAdminDisp.__dispatch(_IAdminDisp.java:2345) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ApiUsageException message = No such experimenter: script_test_user } WARNING omero.gateway:__init__.py:4810 ApiUsageException on to <5f326e87-9143-4773-bb48-c3b6ca8b5c16omero.api.IAdmin> lookupGroup(('rw_script',), {}) Traceback (most recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py", line 4830, in __call__ return self.f(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_api_IAdmin_ice.py", line 937, in lookupGroup return _M_omero.api.IAdmin._op_lookupGroup.invoke(self, ((name, ), _ctx)) omero.ApiUsageException: exception ::omero::ApiUsageException { serverStackTrace = ome.conditions.ApiUsageException: No such group: rw_script at ome.logic.AdminImpl.lookupGroup(AdminImpl.java:366) at jdk.internal.reflect.GeneratedMethodAccessor563.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at 
org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.lookupGroup(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor563.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.lookupGroup(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor589.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.AdminI.lookupGroup_async(AdminI.java:266) at jdk.internal.reflect.GeneratedMethodAccessor588.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at 
omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy97.lookupGroup_async(Unknown Source) at omero.api._IAdminTie.lookupGroup_async(_IAdminTie.java:276) at omero.api._IAdminDisp.___lookupGroup(_IAdminDisp.java:990) at omero.api._IAdminDisp.__dispatch(_IAdminDisp.java:2353) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ApiUsageException message = No such group: rw_script } INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) ------------------------------ Captured log call ------------------------------- INFO omero.gateway:__init__.py:2243 created connection (uuid=65a18050-5f8f-4537-92bc-e799d39c5547) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=12330792-083f-425c-93e9-b943119e99b2) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=5dc53f01-284c-437d-9b7e-caec06d5f787) WARNING omero.gateway:__init__.py:4810 NoProcessorAvailable on to <93fc0a43-9c40-4dcd-a1cb-35d0024d4a4comero.api.IScript> runScript((5945, {'datasetId': object #0 (::omero::RLong) { _val = 1354 }}, None, ), {}) Traceback (most recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py", line 4830, in __call__ return self.f(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_api_IScript_ice.py", line 935, in runScript return _M_omero.api.IScript._op_runScript.invoke(self, ((scriptID, inputs, waitSecs), _ctx)) omero.NoProcessorAvailable: exception ::omero::NoProcessorAvailable { serverStackTrace = serverExceptionClass = message = No processor available! 
[0 response(s)] processorCount = 0 } --------------------------- Captured stderr teardown --------------------------- ** ---------------------------- Captured log teardown ----------------------------- INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=5f4247a9-6d67-4c93-847a-74bd9530cb2a) INFO omero.gateway:__init__.py:2243 created connection (uuid=8817ea2c-602b-4e6c-a160-b1f1b036314d) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=7d6ab483-eb1c-42d5-87f2-96e3d47439c7) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) ___________________ TestPerformance.testListFileAnnotations ____________________ self = gatewaywrapper = @pytest.mark.broken(ticket="11494") def testListFileAnnotations(self, gatewaywrapper): """ testListFileAnnotations: test speed of getObjects('FileAnnotation') vv listFileAnnotations() """ gatewaywrapper.loginAsAuthor() updateService = gatewaywrapper.gateway.getUpdateService() def createFileAnnotation(name, ns): originalFile = omero.model.OriginalFileI() originalFile.setName(rstring(name)) originalFile.setPath(rstring(name)) originalFile.setSize(rlong(0)) originalFile.setHash(rstring("Foo")) originalFile = updateService.saveAndReturnObject(originalFile) fa = omero.model.FileAnnotationI() fa.setFile(originalFile) fa.setNs(rstring(ns)) fa = updateService.saveAndReturnObject(fa) return fa.id.val ns = "omero.gatewaytest.PerformanceTest.testListFileAnnotations" fileCount = 250 fileAnnIds = [createFileAnnotation("testListFileAnnotations%s" % i, ns) for i in range(fileCount)] # test speed of listFileAnnotations startTime = time.time() fileCount = 0 fileAnns = gatewaywrapper.gateway.listFileAnnotations(toInclude=[ns]) for fa in fileAnns: fa.getFileName() fileCount += 1 t1 = time.time() - startTime print("listFileAnnotations for %d files = %s secs" % (fileCount, t1)) # Typically 1.4 secs # test speed of getOjbects("Annotation") - lazy loading file names startTime = time.time() fileCount = 0 fileAnns = gatewaywrapper.gateway.getObjects( "FileAnnotation", attributes={'ns': ns}) for fa in fileAnns: fa.getFileName() fileCount += 1 t2 = time.time() - startTime print("getObjects, lazy loading file names for %d files = %s secs" % (fileCount, t2)) # Typically 2.8 secs # test speed of getOjbects("Annotation") - NO loading file names startTime = time.time() fileCount = 0 fileAnns = gatewaywrapper.gateway.getObjects( "FileAnnotation", attributes={'ns': ns}) for fa in fileAnns: fa.getId() fileCount += 1 t3 = time.time() - startTime print("getObjects, NO file names for %d files = %s secs" % (fileCount, t3)) # Typically 0.4 secs > assert t1 < t2, "Blitz listFileAnnotations() should be faster " \ "than getObjects('FileAnnotation')" E AssertionError: Blitz listFileAnnotations() should be faster than getObjects('FileAnnotation') E assert 0.3212752342224121 < 0.16168785095214844 test/integration/gatewaytest/test_performance.py:91: AssertionError ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=404e6440-3a07-4469-8b04-b6460ee21443) INFO omero.gateway:__init__.py:2243 created connection (uuid=b63f6848-ede2-4bc6-b227-3155aa7ae572) INFO 
omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=a6a17da0-a37f-4ac2-b2e1-7f1e9ea652ad) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=8293dbfb-c51f-414b-b96e-5225ef0a3217) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=e6c7a033-2df8-4819-842f-bd6191e8e0bf) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=9f2394fb-dcd0-405c-841e-08aacf0177ff) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) ----------------------------- Captured stdout call ----------------------------- listFileAnnotations for 250 files = 0.3212752342224121 secs getObjects, lazy loading file names for 250 files = 0.16168785095214844 secs getObjects, NO file names for 250 files = 0.15201234817504883 secs ------------------------------ Captured log call ------------------------------- INFO omero.gateway:__init__.py:2243 created connection (uuid=15a7cb8d-9491-4f70-b8fa-1118f5ba9e67) --------------------------- Captured stderr teardown --------------------------- ** ---------------------------- Captured log teardown ----------------------------- INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=a8327c10-befa-4902-a47c-56b3c1723446) INFO omero.gateway:__init__.py:2243 created connection (uuid=59075075-2813-42f4-93f9-301d94cada2b) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=3f9810e5-d36d-4d01-8e35-74745cc91ba4) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) _________________________ TestUser.testCrossGroupRead __________________________ self = gatewaywrapper = @pytest.mark.broken(ticket="11545") def testCrossGroupRead(self, gatewaywrapper): gatewaywrapper.loginAsAuthor() p = gatewaywrapper.getTestProject() assert str(p.getDetails().permissions)[4] == '-' d = p.getDetails() g = d.getGroup() gatewaywrapper.loginAsUser() gatewaywrapper.gateway.SERVICE_OPTS.setOmeroGroup('-1') > assert not g.getId() in \ gatewaywrapper.gateway.getEventContext().memberOfGroups E assert not 2719 in [2718, 1, 2730, 2719, 2780] E + where 2719 = getId() E + where getId = <_ExperimenterGroupWrapper id=2719>.getId E + and [2718, 1, 2730, 2719, 2780] = object #0 (::omero::sys::EventContext)\n{\n shareId = -1\n sessionId = 10312\n sessionUuid = 9b756529-dc91-4678-b52c-11c942c2d2aa\n userId = 3728\n userName = weblitz_test_user\n sudoerId = \n sudoerName = \n groupId = 2718\n groupName = weblitz_test_user_group\n isAdmin = False\n adminPrivileges = \n {\n }\n eventId = -1\n eventType = Internal\n memberOfGroups = \n {\n [0] = 2718\n [1] = 1\n [2] = 2730\n [3] = 2719\n [4] = 2780\n }\n leaderOfGroups = \n {\n }\n groupPermissions = object #1 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -56\n }\n}.memberOfGroups E + where object #0 (::omero::sys::EventContext)\n{\n shareId = -1\n sessionId = 10312\n sessionUuid = 9b756529-dc91-4678-b52c-11c942c2d2aa\n userId = 3728\n userName = weblitz_test_user\n sudoerId = \n sudoerName = \n groupId = 2718\n groupName = 
weblitz_test_user_group\n isAdmin = False\n adminPrivileges = \n {\n }\n eventId = -1\n eventType = Internal\n memberOfGroups = \n {\n [0] = 2718\n [1] = 1\n [2] = 2730\n [3] = 2719\n [4] = 2780\n }\n leaderOfGroups = \n {\n }\n groupPermissions = object #1 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -56\n }\n} = getEventContext() E + where getEventContext = .getEventContext E + where = .gateway test/integration/gatewaytest/test_user.py:127: AssertionError ------------------------------ Captured log setup ------------------------------ INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=a03f1047-adbd-4607-92a9-e06c17902217) INFO omero.gateway:__init__.py:2243 created connection (uuid=1a5adfda-827c-415e-9194-0d0dc132906b) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=98bb1b98-aa7d-42c8-81ec-0783646c5825) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=60f25801-3dc5-4b60-abec-e3081f8163f7) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=1b4ebd6b-d3c1-47c5-ac4d-2338352150ee) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=0e16902f-b999-4013-a672-b3d201f1089a) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) ------------------------------ Captured log call ------------------------------- INFO omero.gateway:__init__.py:2243 created connection (uuid=dd421369-3a2b-4f13-8f62-3d38ee852803) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=9b756529-dc91-4678-b52c-11c942c2d2aa) --------------------------- Captured stderr teardown --------------------------- ** ---------------------------- Captured log teardown ----------------------------- INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=fd26e80f-20dc-4ea4-b5f7-4a385b1c1f86) INFO omero.gateway:__init__.py:2243 created connection (uuid=8dae1d58-0a0b-4088-9996-d11e9c0d5a74) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) INFO omero.gateway:__init__.py:2243 created connection (uuid=e0d8ad95-95da-4516-bea2-b13e1b871fe6) INFO omero.gateway:__init__.py:1940 closed connection (uuid=None) _________________________ TestPing.testProcessCallback _________________________ self = @pytest.mark.broken(ticket="11494") def testProcessCallback(self): callback = CallbackI() id = self.client.getCommunicator().stringToIdentity(str(uuid.uuid4())) cb = self.client.getAdapter().add(callback, id) cb = omero.grid.ProcessCallbackPrx.uncheckedCast(cb) p = self._getProcessor() params = p.params() assert params.stdoutFormat process = p.execute(rmap({})) process.registerCallback(cb) self.assertSuccess(p, process) > assert len(callback.finish) > 0 E assert 0 > 0 E + where 0 = len([]) E + where [] = object #0 (::omero::grid::ProcessCallback)\n{\n}.finish test/integration/scriptstest/test_ping.py:214: AssertionError ______________________ 
TestScriptRepo.testGetGroupScripts ______________________ self = @pytest.mark.broken(ticket="11494") def testGetGroupScripts(self): scriptService = self.sf.getScriptService() client = self.new_client(self.group) sid = client.sf.getScriptService().uploadScript( "/test/otheruser.py", """if True: import omero, omero.scripts as OS OS.client("testGetGroupScripts") """) myGroupScripts = scriptService.getUserScripts( [omero.model.ExperimenterGroupI(self.group.id.val, False)]) > assert sid in [x.id.val for x in myGroupScripts] E assert 6201 in [] test/integration/scriptstest/test_repo.py:78: AssertionError _____________________________ TestTables.test2098 ______________________________ self = @pytest.mark.broken(ticket="11534") def test2098(self): """ Creates and downloads an HDF file and checks that its size and hash match whats in the db """ grid = self.client.sf.sharedResources() table = grid.newTable(1, "/test") assert table lc = columns.LongColumnI('lc', 'desc', [1]) file = None try: file = table.getOriginalFile() assert file table.initialize([lc]) table.addData([lc]) finally: # Not deleting since queried table.close() # Reload the file file = self.client.sf.getQueryService().get( "OriginalFile", file.id.val) # Check values > p = path.path(self.tmpfile()) E AttributeError: type object 'path' has no attribute 'path' test/integration/tablestest/test_service.py:154: AttributeError _________________________ TestTables.testReadOnlyFile __________________________ self = @pytest.mark.broken(ticket="unimplemented") def testReadOnlyFile(self): """ Create an HDF5 file on the server, and then mark it read-only. The server should still allow you to load & read that file. """ self.testBlankTable() # ofile > filename = self.unique_dir + "/file.txt" E AttributeError: 'TestTables' object has no attribute 'unique_dir' test/integration/tablestest/test_service.py:661: AttributeError ___________________________ TestTables.testReadEqual ___________________________ self = twoColumnFiveRowTable = 473bca6c-804b-4ae8-a75f-67988e0b80b1/Table-16c40e25-786d-4b28-a12e-e5e2b1128d6a -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 @pytest.mark.broken(reason="start=0,end=0 to be reviewed") def testReadEqual(self, twoColumnFiveRowTable): # start=0, end=0 has a special contract data = twoColumnFiveRowTable.read([0], 0, 0) assert 1 == len(data.columns) > assert [1] == data.columns[0].values E assert [1] == [1, 2, 3, 4, 5] E E Right contains 4 more items, first extra item: 2 E E Full diff: E [ E 1, E - 2, E - 3, E - 4, E - 5, E ] test/integration/tablestest/test_service.py:967: AssertionError ________________________ TestTables.testReadOutOfRange _________________________ self = twoColumnFiveRowTable = 473bca6c-804b-4ae8-a75f-67988e0b80b1/Table-22b94011-4a69-4b54-b389-94030e173e1d -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 @pytest.mark.broken(reason="need to be reviewed") def testReadOutOfRange(self, twoColumnFiveRowTable): # [1, 2, 3, 4, 5][-1:5] = [5] data = twoColumnFiveRowTable.read([0], -1, 5) assert 1 == len(data.columns) assert [5] == data.columns[0].values > assert [4] == data.rowNumbers E assert [4] == [-1, 0, 1, 2, 3, 4] E E At index 0 diff: 4 != -1 E Right contains 5 more items, first extra item: 0 E E Full diff: E [ E - -1, E - 0, E - 1, E - 2, E - 3, E 4, E ] test/integration/tablestest/test_service.py:1015: AssertionError ____________________ TestAdmin.testChangePasswordWhenUnset _____________________ self = @pytest.mark.broken(reason="Empty password disabled by config", ticket="3201") def 
testChangePasswordWhenUnset(self): """ Shows that it's possible to use the changePasswordWithOldPassword when previously no password was set. See ticket:3201 """ client = self.new_client() admin = client.sf.getAdminService() # By setting the user's password to the empty string # any password will be allowed as the old password admin.changePassword(rstring("")) > admin.changePasswordWithOldPassword(rstring("IGNORED"), rstring("ome")) test/integration/test_admin.py:118: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = 64797a2a-2e21-402c-82bc-1fa24e24cc2a/ef79ab4b-3c80-415a-8d92-dfdf99a84758omero.api.IAdmin -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 oldPassword = object #0 (::omero::RString) { _val = IGNORED } newPassword = object #0 (::omero::RString) { _val = ome }, _ctx = None def changePasswordWithOldPassword(self, oldPassword, newPassword, _ctx=None): > return _M_omero.api.IAdmin._op_changePasswordWithOldPassword.invoke(self, ((oldPassword, newPassword), _ctx)) E omero.SecurityViolation: exception ::omero::SecurityViolation E { E serverStackTrace = ome.conditions.SecurityViolation: Old password is invalid E at ome.logic.AdminImpl.changePasswordWithOldPassword(AdminImpl.java:1253) E at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) E at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.EventHandler.invoke(EventHandler.java:154) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) E at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) E at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy96.changePasswordWithOldPassword(Unknown Source) E at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) E at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) E at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy96.changePasswordWithOldPassword(Unknown Source) E at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) E at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) E at ome.services.throttling.Callback.run(Callback.java:56) E at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) E at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) E at ome.services.blitz.impl.AdminI.changePasswordWithOldPassword_async(AdminI.java:144) E at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) E at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at omero.cmd.CallContext.invoke(CallContext.java:85) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy97.changePasswordWithOldPassword_async(Unknown Source) E at omero.api._IAdminTie.changePasswordWithOldPassword_async(_IAdminTie.java:112) E at omero.api._IAdminDisp.___changePasswordWithOldPassword(_IAdminDisp.java:1977) E at omero.api._IAdminDisp.__dispatch(_IAdminDisp.java:2229) E at IceInternal.Incoming.invoke(Incoming.java:221) E at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) E at Ice.ConnectionI.dispatch(ConnectionI.java:1145) E at Ice.ConnectionI.message(ConnectionI.java:1056) E at IceInternal.ThreadPool.run(ThreadPool.java:395) E at IceInternal.ThreadPool.access$300(ThreadPool.java:12) E at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) E at java.base/java.lang.Thread.run(Thread.java:829) 
E E serverExceptionClass = ome.conditions.SecurityViolation E message = Old password is invalid E } ../../../../.venv3/lib64/python3.9/site-packages/omero_api_IAdmin_ice.py:2460: SecurityViolation ______________________________ TestAdmin.test9193 ______________________________ self = @pytest.mark.broken(reason="Is this test still valid?", ticket="11465") def test9193(self): # Test the removal of removing users # from a group when the group in question # may be their last (i.e. default) group g = self.new_group() u = self.new_user(group=g) # Test removing the default group > self.remove_experimenters(g, [u]) test/integration/test_admin.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:247: in remove_experimenters admin.removeGroups(user, [group]) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = 0e3eb9dc-a60a-41d0-8750-790a16d403fb/03052890-9845-4570-9952-7276a2e3275eomero.api.IAdmin -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 user = object #0 (::omero::model::Experimenter) { _id = object #1 (::omero::RLong) { _val = 4726 } _d...annotationLinksSeq = { } _annotationLinksLoaded = False _annotationLinksCountPerOwner = { } } groups = [object #0 (::omero::model::ExperimenterGroup) { _id = object #1 (::omero::RLong) { _val = 3667 } ...{ } _annotationLinksLoaded = False _annotationLinksCountPerOwner = { } _description = }] _ctx = None def removeGroups(self, user, groups, _ctx=None): > return _M_omero.api.IAdmin._op_removeGroups.invoke(self, ((user, groups), _ctx)) E omero.ValidationException: exception ::omero::ValidationException E { E serverStackTrace = ome.conditions.ValidationException: experimenter cannot be a member of only the 'user' group, a different default group is also required E at ome.logic.AdminImpl.removeGroups(AdminImpl.java:813) E at jdk.internal.reflect.GeneratedMethodAccessor627.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.EventHandler.invoke(EventHandler.java:154) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) E at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) E at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) E at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy96.removeGroups(Unknown Source) E at jdk.internal.reflect.GeneratedMethodAccessor627.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy96.removeGroups(Unknown Source) E at jdk.internal.reflect.GeneratedMethodAccessor3269.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) E at ome.services.throttling.Callback.run(Callback.java:56) E at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) E at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) E at ome.services.blitz.impl.AdminI.removeGroups_async(AdminI.java:318) E at jdk.internal.reflect.GeneratedMethodAccessor3268.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at omero.cmd.CallContext.invoke(CallContext.java:85) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy97.removeGroups_async(Unknown Source) E at omero.api._IAdminTie.removeGroups_async(_IAdminTie.java:312) E at omero.api._IAdminDisp.___removeGroups(_IAdminDisp.java:1621) E at omero.api._IAdminDisp.__dispatch(_IAdminDisp.java:2377) E at IceInternal.Incoming.invoke(Incoming.java:221) E at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) E at Ice.ConnectionI.dispatch(ConnectionI.java:1145) E at Ice.ConnectionI.message(ConnectionI.java:1056) E at 
IceInternal.ThreadPool.run(ThreadPool.java:395) E at IceInternal.ThreadPool.access$300(ThreadPool.java:12) E at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) E at java.base/java.lang.Thread.run(Thread.java:829) E E serverExceptionClass = ome.conditions.ValidationException E message = experimenter cannot be a member of only the 'user' group, a different default group is also required E } ../../../../.venv3/lib64/python3.9/site-packages/omero_api_IAdmin_ice.py:2011: ValidationException _________________ TestFiles.testUploadDifferentSizeTicket2337 __________________ self = @pytest.mark.broken(ticket="11610") def testUploadDifferentSizeTicket2337(self): uploaded = tmpfile() ofile = self.client.upload(str(uploaded), type="text/plain") uploaded.write_lines(["abc", "def"]) # Shorten > ofile = self.client.upload( str(uploaded), type="text/plain", ofile=ofile) test/integration/test_files.py:62: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/clients.py:878: in upload ofile = up.saveAndReturnObject(ofile) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = d07b75ee-9e98-4477-a1f4-bcd645352499/9c40b43c-4950-4032-af4f-df3170afb9b3omero.api.IUpdate -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 obj = object #0 (::omero::model::OriginalFile) { _id = object #1 (::omero::RLong) { _val = 6206 } _d...ksCountPerOwner = { } _name = object #60 (::omero::RString) { _val = omerok2jfbmsa.tmp } } _ctx = None def saveAndReturnObject(self, obj, _ctx=None): > return _M_omero.api.IUpdate._op_saveAndReturnObject.invoke(self, ((obj, ), _ctx)) E omero.OptimisticLockException: exception ::omero::OptimisticLockException E { E serverStackTrace = ome.conditions.OptimisticLockException: You are not authorized to change the update event for ome.model.core.OriginalFile:Id_6206 from ome.model.meta.Event:Id_124677 to ome.model.meta.Event:Id_124676 E You may need to reload the object before continuing. 
E at ome.security.basic.OmeroInterceptor.managedEvent(OmeroInterceptor.java:1201) E at ome.security.basic.OmeroInterceptor.checkManagedDetails(OmeroInterceptor.java:963) E at ome.security.basic.OmeroInterceptor.resetDetails(OmeroInterceptor.java:465) E at ome.security.basic.OmeroInterceptor.onFlushDirty(OmeroInterceptor.java:239) E at org.hibernate.event.def.DefaultFlushEntityEventListener.invokeInterceptor(DefaultFlushEntityEventListener.java:372) E at org.hibernate.event.def.DefaultFlushEntityEventListener.handleInterception(DefaultFlushEntityEventListener.java:349) E at org.hibernate.event.def.DefaultFlushEntityEventListener.scheduleUpdate(DefaultFlushEntityEventListener.java:287) E at org.hibernate.event.def.DefaultFlushEntityEventListener.onFlushEntity(DefaultFlushEntityEventListener.java:155) E at org.hibernate.event.def.AbstractFlushingEventListener.flushEntities(AbstractFlushingEventListener.java:219) E at org.hibernate.event.def.AbstractFlushingEventListener.flushEverythingToExecutions(AbstractFlushingEventListener.java:99) E at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:50) E at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1216) E at ome.logic.UpdateImpl.afterUpdate(UpdateImpl.java:342) E at ome.logic.UpdateImpl.doAction(UpdateImpl.java:358) E at ome.logic.UpdateImpl.doAction(UpdateImpl.java:349) E at ome.logic.UpdateImpl.saveAndReturnObject(UpdateImpl.java:135) E at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.EventHandler.invoke(EventHandler.java:154) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) E at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) E at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) E at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) E at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) E at jdk.internal.reflect.GeneratedMethodAccessor683.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) E at ome.services.throttling.Callback.run(Callback.java:56) E at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) E at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) E at ome.services.blitz.impl.UpdateI.saveAndReturnObject_async(UpdateI.java:62) E at jdk.internal.reflect.GeneratedMethodAccessor682.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at omero.cmd.CallContext.invoke(CallContext.java:85) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy102.saveAndReturnObject_async(Unknown Source) E at omero.api._IUpdateTie.saveAndReturnObject_async(_IUpdateTie.java:92) E at omero.api._IUpdateDisp.___saveAndReturnObject(_IUpdateDisp.java:229) E at omero.api._IUpdateDisp.__dispatch(_IUpdateDisp.java:423) E at IceInternal.Incoming.invoke(Incoming.java:221) E at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) E at Ice.ConnectionI.dispatch(ConnectionI.java:1145) E at Ice.ConnectionI.message(ConnectionI.java:1056) E at IceInternal.ThreadPool.run(ThreadPool.java:395) E at IceInternal.ThreadPool.access$300(ThreadPool.java:12) E at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) E at java.base/java.lang.Thread.run(Thread.java:829) E E serverExceptionClass = ome.conditions.OptimisticLockException E message = You are not authorized to change the update event for ome.model.core.OriginalFile:Id_6206 from ome.model.meta.Event:Id_124677 to ome.model.meta.Event:Id_124676 E You may need to reload 
the object before continuing. E backOff = 0 E } ../../../../.venv3/lib64/python3.9/site-packages/omero_api_IUpdate_ice.py:163: OptimisticLockException _____________________________ TestIShare.test1172 ______________________________ self = @pytest.mark.broken(reason="shares are image-centric for now") def test1172(self): uuid = self.root.sf.getAdminService().getEventContext().sessionUuid share = self.root.sf.getShareService() query = self.root.sf.getQueryService() # create user client_share1, user1 = self.new_client_and_user() # create dataset with image ds = self.make_dataset(name="dataset-%s" % uuid, client=self.root) img = self.new_image(name='test-img in dataset-%s' % uuid) self.link(ds, img, client=self.root) items = list() p = omero.sys.Parameters() p.map = {"oid": ds.id} sql = ( "select ds from Dataset ds " "join fetch ds.details.owner " "join fetch ds.details.group " "left outer join fetch ds.imageLinks dil " "left outer join fetch dil.child i " "where ds.id=:oid") items.extend(query.findAllByQuery(sql, p)) assert 1 == len(items) # members p.map["eid"] = rlong(user1.id.val) sql = ("select e from Experimenter e " "where e.id =:eid order by e.omeName") ms = query.findAllByQuery(sql, p) sid = share.createShare(("test-share-%s" % uuid), rtime(int(time.time() * 1000 + 86400)), items, ms, [], True) # USER RETRIEVAL # login as user1 share1 = client_share1.sf.getShareService() query1 = client_share1.sf.getQueryService() content = share1.getContents(sid) # Content now contains just the dataset with nothing loaded assert 1 == len(content) # get shared dataset and image when share is activated share1.activate(sid) # retrieve dataset p = omero.sys.Parameters() p.map = {"ids": rlist([ds.id])} sql = ( "select ds from Dataset ds " "join fetch ds.details.owner " "join fetch ds.details.group " "left outer join fetch ds.imageLinks dil " "left outer join fetch dil.child i " "where ds.id in (:ids) order by ds.name") try: res1 = query1.findAllByQuery(sql, p) assert False, "This should throw an exception" except Exception: pass # Now we add all the other elements to the share to prevent # the security violation # # Not working imgs = cntar.getImages("Dataset",[ds.id.val], None) img = query.findByQuery( "select i from Image i join fetch i.datasetLinks dil " "join dil.parent d where d.id = %s " % ds.id.val, None) assert img share.addObject(sid, img) share.addObjects(sid, img.copyDatasetLinks()) assert 3 == len(share.getContents(sid)) # And try again to load them share1.activate(sid) res1 = query1.findAllByQuery(sql, p) > assert len(res1) == 1 E assert 0 == 1 E + where 0 = len([]) test/integration/test_ishare.py:381: AssertionError ____________________________ TestITimeline.test1225 ____________________________ self = @pytest.mark.broken(ticket="1225") def test1225(self): uuid = self.root.sf.getAdminService().getEventContext().sessionUuid update = self.root.sf.getUpdateService() timeline = self.root.sf.getTimelineService() query = self.root.sf.getQueryService() # create dataset to_save = list() for i in range(0, 10): to_save.append(self.new_dataset(name="ds-%i-%s" % (i, uuid))) dss = update.saveAndReturnArray(to_save) # create tag for i in range(0, 10): ds1 = query.get("Dataset", dss[i].id.val) ann = omero.model.TagAnnotationI() ann.textValue = rstring('tag-%i-%s' % (i, uuid)) ann.setDescription(rstring('desc-%i-%s' % (i, uuid))) t_ann = omero.model.DatasetAnnotationLinkI() t_ann.setParent(ds1) t_ann.setChild(ann) update.saveObject(t_ann) p = omero.sys.Parameters() p.map = {} f = omero.sys.Filter() 
f.ownerId = rlong(0) f.limit = rint(10) p.theFilter = f M = timeline.getMostRecentAnnotationLinks tagids = set([e.child.id.val for e in M(None, ['TagAnnotation'], None, p)]) assert len(tagids) == 10 # And under #9609 tagids = set([e.child.id.val for e in M(None, ['TagAnnotation'], None, p, {"omero.group": "-1"})]) > assert len(tagids) == 10 E assert 6 == 10 E + where 6 = len({1908, 1944, 1995, 9450, 13072, 41480}) test/integration/test_itimeline.py:244: AssertionError ___________________________ TestPermissions.test3136 ___________________________ self = @pytest.mark.broken(ticket="11494") def test3136(self): """ Calls to updateGroup were taking too long because the default value of permissions returned by the server was triggering a full changePermissions event. """ admin = self.root.sf.getAdminService() group = self.new_group(perms="rw----") # Change the name but not the permissions group.name = rstring(self.uuid()) elapsed1, rv = self.timeit(admin.updateGroup, group) # Now change the name and the permissions group.name = rstring(self.uuid()) group.details.permissions = omero.model.PermissionsI("rwr---") elapsed2, rv = self.timeit(admin.updateGroup, group) # Locally this test always fails as the two times are # the same order of magnitude. This may be an indication that # the relevant ticket: # https://trac.openmicroscopy.org/ome/ticket/3136 # is still valid. Does the ticket need re-opening # or does the test condition need relaxing? > assert elapsed1 < (0.1 * elapsed2), \ "elapsed1=%s, elapsed2=%s" % (elapsed1, elapsed2) E AssertionError: elapsed1=0.03419137001037598, elapsed2=0.01781940460205078 E assert 0.03419137001037598 < (0.1 * 0.01781940460205078) test/integration/test_permissions.py:488: AssertionError __________________ TestPermissions.testSaveWithNegOneExplicit __________________ self = @pytest.mark.broken(ticket="11374") def testSaveWithNegOneExplicit(self): # Get a user and services client, user = self.new_client_and_user() # Create a new object with an explicit group admin = client.sf.getAdminService() ec = admin.getEventContext() grp = omero.model.ExperimenterGroupI(ec.groupId, False) tag = omero.model.TagAnnotationI() tag.details.group = grp # Now try to save it in the -1 context update = client.sf.getUpdateService() all_context = {"omero.group": "-1"} > update.saveAndReturnObject(tag, all_context) test/integration/test_permissions.py:586: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = c4ae7add-91bd-41f8-81e8-c73cb3cc9fb5/9659cfad-c197-4483-99ce-c967234fa61bomero.api.IUpdate -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 obj = object #0 (::omero::model::TagAnnotation) { _id = _details = object #1 (::omero::model::Details) { ...nksSeq = { } _annotationLinksLoaded = True _annotationLinksCountPerOwner = {} _textValue = } _ctx = {'omero.group': '-1'} def saveAndReturnObject(self, obj, _ctx=None): > return _M_omero.api.IUpdate._op_saveAndReturnObject.invoke(self, ((obj, ), _ctx)) E omero.ApiUsageException: exception ::omero::ApiUsageException E { E serverStackTrace = ome.conditions.ApiUsageException: No valid permissions available! DUMMY permissions are not intended for copying. 
Make sure that you have not passed omero.group=-1 for a save without context E at ome.model.internal.Permissions.(Permissions.java:164) E at ome.security.basic.CurrentDetails.createDetails(CurrentDetails.java:439) E at ome.security.basic.OmeroInterceptor.newTransientDetails(OmeroInterceptor.java:700) E at ome.security.basic.OmeroInterceptor.onSave(OmeroInterceptor.java:187) E at org.hibernate.event.def.AbstractSaveEventListener.substituteValuesIfNecessary(AbstractSaveEventListener.java:413) E at org.hibernate.event.def.AbstractSaveEventListener.performSaveOrReplicate(AbstractSaveEventListener.java:292) E at org.hibernate.event.def.AbstractSaveEventListener.performSave(AbstractSaveEventListener.java:203) E at org.hibernate.event.def.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:143) E at org.hibernate.event.def.DefaultMergeEventListener.saveTransientEntity(DefaultMergeEventListener.java:415) E at org.hibernate.event.def.DefaultMergeEventListener.mergeTransientEntity(DefaultMergeEventListener.java:341) E at org.hibernate.event.def.DefaultMergeEventListener.entityIsTransient(DefaultMergeEventListener.java:303) E at org.springframework.orm.hibernate3.support.IdTransferringMergeEventListener.entityIsTransient(IdTransferringMergeEventListener.java:62) E at ome.security.basic.MergeEventListener.entityIsTransient(MergeEventListener.java:154) E at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:258) E at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:87) E at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:84) E at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:73) E at org.hibernate.impl.SessionImpl.fireMerge(SessionImpl.java:867) E at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:851) E at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:855) E at ome.logic.UpdateImpl.internalMerge(UpdateImpl.java:313) E at ome.logic.UpdateImpl$2.run(UpdateImpl.java:138) E at ome.logic.UpdateImpl$2.run(UpdateImpl.java:135) E at ome.logic.UpdateImpl.doAction(UpdateImpl.java:357) E at ome.logic.UpdateImpl.doAction(UpdateImpl.java:349) E at ome.logic.UpdateImpl.saveAndReturnObject(UpdateImpl.java:135) E at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.EventHandler.invoke(EventHandler.java:154) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) E at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) E at 
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) E at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) E at jdk.internal.reflect.GeneratedMethodAccessor683.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) E at ome.services.throttling.Callback.run(Callback.java:56) E at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) E at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) E at ome.services.blitz.impl.UpdateI.saveAndReturnObject_async(UpdateI.java:62) E at jdk.internal.reflect.GeneratedMethodAccessor682.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at omero.cmd.CallContext.invoke(CallContext.java:85) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy102.saveAndReturnObject_async(Unknown Source) E at omero.api._IUpdateTie.saveAndReturnObject_async(_IUpdateTie.java:92) E at 
omero.api._IUpdateDisp.___saveAndReturnObject(_IUpdateDisp.java:229) E at omero.api._IUpdateDisp.__dispatch(_IUpdateDisp.java:423) E at IceInternal.Incoming.invoke(Incoming.java:221) E at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) E at Ice.ConnectionI.dispatch(ConnectionI.java:1145) E at Ice.ConnectionI.message(ConnectionI.java:1056) E at IceInternal.ThreadPool.run(ThreadPool.java:395) E at IceInternal.ThreadPool.access$300(ThreadPool.java:12) E at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) E at java.base/java.lang.Thread.run(Thread.java:829) E E serverExceptionClass = ome.conditions.ApiUsageException E message = No valid permissions available! DUMMY permissions are not intended for copying. Make sure that you have not passed omero.group=-1 for a save without context E } ../../../../.venv3/lib64/python3.9/site-packages/omero_api_IUpdate_ice.py:163: ApiUsageException ____________________ TestPermissions.testSaveWithNegBadLink ____________________ self = @pytest.mark.broken(ticket="11374") def testSaveWithNegBadLink(self): # ticket:8194 # Get a user and services client, user = self.new_client_and_user() admin = client.sf.getAdminService() group1 = admin.getGroup(admin.getEventContext().groupId) group2 = self.new_group(experimenters=[user]) for x in (group1, group2): x.unload() admin.getEventContext() # Refresh # Create a new object with a bad link image = self.new_image() image.details.group = group1 tag = omero.model.TagAnnotationI() tag.details.group = group2 link = image.linkAnnotation(tag) link.details.group = group2 # Now try to save it in the -1 context update = client.sf.getUpdateService() all_context = {"omero.group": "-1"} # Bad links should be detected and # a security violation raised. with pytest.raises(omero.GroupSecurityViolation): > update.saveAndReturnObject(image, all_context) test/integration/test_permissions.py:635: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = a57b0b50-8d9c-41c8-9c35-bff42ea83138/1c288436-cc49-4604-9bbe-cafc6daed01eomero.api.IUpdate -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 obj = object #0 (::omero::model::Image) { _id = _details = object #1 (::omero::model::Details) { _...bb6100-4aab-4b18-bc2d-b9dbf2ce12d2 } _description = object #14 (::omero::RString) { _val = } } _ctx = {'omero.group': '-1'} def saveAndReturnObject(self, obj, _ctx=None): > return _M_omero.api.IUpdate._op_saveAndReturnObject.invoke(self, ((obj, ), _ctx)) E omero.ApiUsageException: exception ::omero::ApiUsageException E { E serverStackTrace = ome.conditions.ApiUsageException: No valid permissions available! DUMMY permissions are not intended for copying. 
Make sure that you have not passed omero.group=-1 for a save without context E at ome.model.internal.Permissions.(Permissions.java:164) E at ome.security.basic.CurrentDetails.createDetails(CurrentDetails.java:439) E at ome.security.basic.OmeroInterceptor.newTransientDetails(OmeroInterceptor.java:700) E at ome.security.basic.OmeroInterceptor.onSave(OmeroInterceptor.java:187) E at org.hibernate.event.def.AbstractSaveEventListener.substituteValuesIfNecessary(AbstractSaveEventListener.java:413) E at org.hibernate.event.def.AbstractSaveEventListener.performSaveOrReplicate(AbstractSaveEventListener.java:292) E at org.hibernate.event.def.AbstractSaveEventListener.performSave(AbstractSaveEventListener.java:203) E at org.hibernate.event.def.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:143) E at org.hibernate.event.def.DefaultMergeEventListener.saveTransientEntity(DefaultMergeEventListener.java:415) E at org.hibernate.event.def.DefaultMergeEventListener.mergeTransientEntity(DefaultMergeEventListener.java:341) E at org.hibernate.event.def.DefaultMergeEventListener.entityIsTransient(DefaultMergeEventListener.java:303) E at org.springframework.orm.hibernate3.support.IdTransferringMergeEventListener.entityIsTransient(IdTransferringMergeEventListener.java:62) E at ome.security.basic.MergeEventListener.entityIsTransient(MergeEventListener.java:154) E at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:258) E at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:87) E at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:84) E at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:73) E at org.hibernate.impl.SessionImpl.fireMerge(SessionImpl.java:867) E at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:851) E at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:855) E at ome.logic.UpdateImpl.internalMerge(UpdateImpl.java:313) E at ome.logic.UpdateImpl$2.run(UpdateImpl.java:138) E at ome.logic.UpdateImpl$2.run(UpdateImpl.java:135) E at ome.logic.UpdateImpl.doAction(UpdateImpl.java:357) E at ome.logic.UpdateImpl.doAction(UpdateImpl.java:349) E at ome.logic.UpdateImpl.saveAndReturnObject(UpdateImpl.java:135) E at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.EventHandler.invoke(EventHandler.java:154) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) E at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) E at 
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) E at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) E at jdk.internal.reflect.GeneratedMethodAccessor683.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) E at ome.services.throttling.Callback.run(Callback.java:56) E at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) E at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) E at ome.services.blitz.impl.UpdateI.saveAndReturnObject_async(UpdateI.java:62) E at jdk.internal.reflect.GeneratedMethodAccessor682.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at omero.cmd.CallContext.invoke(CallContext.java:85) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy102.saveAndReturnObject_async(Unknown Source) E at omero.api._IUpdateTie.saveAndReturnObject_async(_IUpdateTie.java:92) E at 
omero.api._IUpdateDisp.___saveAndReturnObject(_IUpdateDisp.java:229) E at omero.api._IUpdateDisp.__dispatch(_IUpdateDisp.java:423) E at IceInternal.Incoming.invoke(Incoming.java:221) E at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) E at Ice.ConnectionI.dispatch(ConnectionI.java:1145) E at Ice.ConnectionI.message(ConnectionI.java:1056) E at IceInternal.ThreadPool.run(ThreadPool.java:395) E at IceInternal.ThreadPool.access$300(ThreadPool.java:12) E at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) E at java.base/java.lang.Thread.run(Thread.java:829) E E serverExceptionClass = ome.conditions.ApiUsageException E message = No valid permissions available! DUMMY permissions are not intended for copying. Make sure that you have not passed omero.group=-1 for a save without context E } ../../../../.venv3/lib64/python3.9/site-packages/omero_api_IUpdate_ice.py:163: ApiUsageException _______________________ TestPermissions.testSaveBadLink ________________________ self = @pytest.mark.broken(ticket="11375") def testSaveBadLink(self): # Get a user and services client, user = self.new_client_and_user() admin = client.sf.getAdminService() group1 = admin.getGroup(admin.getEventContext().groupId) group2 = self.new_group(experimenters=[user]) for x in (group1, group2): x.unload() admin.getEventContext() # Refresh # Create a new object with a bad link image = self.new_image() image.details.group = group1 tag = omero.model.TagAnnotationI() tag.details.group = group2 link = image.linkAnnotation(tag) link.details.group = group2 # Now try to save it update = client.sf.getUpdateService() # Bad links should be detected and # a security violation raised. with pytest.raises(omero.GroupSecurityViolation): > update.saveAndReturnObject(image) test/integration/test_permissions.py:664: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = 2954c7cd-6edf-4fdc-afbf-2c148f070228/b03756b4-2c51-4a20-909d-50b2d7477d58omero.api.IUpdate -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 obj = object #0 (::omero::model::Image) { _id = _details = object #1 (::omero::model::Details) { _...5d1f37-7d75-41dd-8fae-ef642b461aa0 } _description = object #14 (::omero::RString) { _val = } } _ctx = None def saveAndReturnObject(self, obj, _ctx=None): > return _M_omero.api.IUpdate._op_saveAndReturnObject.invoke(self, ((obj, ), _ctx)) E omero.SecurityViolation: exception ::omero::SecurityViolation E { E serverStackTrace = ome.conditions.SecurityViolation: You are not authorized to set the ExperimenterGroup for ome.model.annotations.TagAnnotation:Id_41635 to ome.model.meta.ExperimenterGroup:Id_3678 E at ome.security.basic.OmeroInterceptor.newTransientDetails(OmeroInterceptor.java:785) E at ome.security.basic.OmeroInterceptor.onSave(OmeroInterceptor.java:187) E at org.hibernate.event.def.AbstractSaveEventListener.substituteValuesIfNecessary(AbstractSaveEventListener.java:413) E at org.hibernate.event.def.AbstractSaveEventListener.performSaveOrReplicate(AbstractSaveEventListener.java:292) E at org.hibernate.event.def.AbstractSaveEventListener.performSave(AbstractSaveEventListener.java:203) E at org.hibernate.event.def.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:143) E at org.hibernate.event.def.DefaultMergeEventListener.saveTransientEntity(DefaultMergeEventListener.java:415) E at org.hibernate.event.def.DefaultMergeEventListener.mergeTransientEntity(DefaultMergeEventListener.java:341) E at 
org.hibernate.event.def.DefaultMergeEventListener.entityIsTransient(DefaultMergeEventListener.java:303) E at org.springframework.orm.hibernate3.support.IdTransferringMergeEventListener.entityIsTransient(IdTransferringMergeEventListener.java:62) E at ome.security.basic.MergeEventListener.entityIsTransient(MergeEventListener.java:154) E at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:258) E at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:87) E at org.hibernate.impl.SessionImpl.fireMerge(SessionImpl.java:877) E at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:859) E at org.hibernate.engine.CascadingAction$6.cascade(CascadingAction.java:279) E at org.hibernate.engine.Cascade.cascadeToOne(Cascade.java:392) E at org.hibernate.engine.Cascade.cascadeAssociation(Cascade.java:335) E at org.hibernate.engine.Cascade.cascadeProperty(Cascade.java:204) E at org.hibernate.engine.Cascade.cascade(Cascade.java:161) E at org.hibernate.event.def.AbstractSaveEventListener.cascadeBeforeSave(AbstractSaveEventListener.java:450) E at org.hibernate.event.def.DefaultMergeEventListener.mergeTransientEntity(DefaultMergeEventListener.java:336) E at org.hibernate.event.def.DefaultMergeEventListener.entityIsTransient(DefaultMergeEventListener.java:303) E at org.springframework.orm.hibernate3.support.IdTransferringMergeEventListener.entityIsTransient(IdTransferringMergeEventListener.java:62) E at ome.security.basic.MergeEventListener.entityIsTransient(MergeEventListener.java:154) E at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:258) E at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:87) E at org.hibernate.impl.SessionImpl.fireMerge(SessionImpl.java:877) E at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:859) E at org.hibernate.engine.CascadingAction$6.cascade(CascadingAction.java:279) E at org.hibernate.engine.Cascade.cascadeToOne(Cascade.java:392) E at org.hibernate.engine.Cascade.cascadeAssociation(Cascade.java:335) E at org.hibernate.engine.Cascade.cascadeProperty(Cascade.java:204) E at org.hibernate.engine.Cascade.cascadeCollectionElements(Cascade.java:425) E at org.hibernate.engine.Cascade.cascadeCollection(Cascade.java:362) E at org.hibernate.engine.Cascade.cascadeAssociation(Cascade.java:338) E at org.hibernate.engine.Cascade.cascadeProperty(Cascade.java:204) E at org.hibernate.engine.Cascade.cascade(Cascade.java:161) E at org.hibernate.event.def.AbstractSaveEventListener.cascadeAfterSave(AbstractSaveEventListener.java:475) E at org.hibernate.event.def.DefaultMergeEventListener.mergeTransientEntity(DefaultMergeEventListener.java:388) E at org.hibernate.event.def.DefaultMergeEventListener.entityIsTransient(DefaultMergeEventListener.java:303) E at org.springframework.orm.hibernate3.support.IdTransferringMergeEventListener.entityIsTransient(IdTransferringMergeEventListener.java:62) E at ome.security.basic.MergeEventListener.entityIsTransient(MergeEventListener.java:154) E at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:258) E at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:87) E at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:84) E at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:73) E at org.hibernate.impl.SessionImpl.fireMerge(SessionImpl.java:867) E at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:851) E at 
org.hibernate.impl.SessionImpl.merge(SessionImpl.java:855) E at ome.logic.UpdateImpl.internalMerge(UpdateImpl.java:313) E at ome.logic.UpdateImpl$2.run(UpdateImpl.java:138) E at ome.logic.UpdateImpl$2.run(UpdateImpl.java:135) E at ome.logic.UpdateImpl.doAction(UpdateImpl.java:357) E at ome.logic.UpdateImpl.doAction(UpdateImpl.java:349) E at ome.logic.UpdateImpl.saveAndReturnObject(UpdateImpl.java:135) E at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.EventHandler.invoke(EventHandler.java:154) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) E at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) E at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) E at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) E at 
jdk.internal.reflect.GeneratedMethodAccessor683.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) E at ome.services.throttling.Callback.run(Callback.java:56) E at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) E at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) E at ome.services.blitz.impl.UpdateI.saveAndReturnObject_async(UpdateI.java:62) E at jdk.internal.reflect.GeneratedMethodAccessor682.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at omero.cmd.CallContext.invoke(CallContext.java:85) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy102.saveAndReturnObject_async(Unknown Source) E at omero.api._IUpdateTie.saveAndReturnObject_async(_IUpdateTie.java:92) E at omero.api._IUpdateDisp.___saveAndReturnObject(_IUpdateDisp.java:229) E at omero.api._IUpdateDisp.__dispatch(_IUpdateDisp.java:423) E at IceInternal.Incoming.invoke(Incoming.java:221) E at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) E at Ice.ConnectionI.dispatch(ConnectionI.java:1145) E at Ice.ConnectionI.message(ConnectionI.java:1056) E at IceInternal.ThreadPool.run(ThreadPool.java:395) E at IceInternal.ThreadPool.access$300(ThreadPool.java:12) E at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) E at java.base/java.lang.Thread.run(Thread.java:829) E E serverExceptionClass = ome.conditions.SecurityViolation E message = You are not authorized to set the ExperimenterGroup for ome.model.annotations.TagAnnotation:Id_41635 to ome.model.meta.ExperimenterGroup:Id_3678 E } ../../../../.venv3/lib64/python3.9/site-packages/omero_api_IUpdate_ice.py:163: SecurityViolation ______ TestPermissions.testUseOfRawFileBeanScriptReadCorrectGroupAndUser _______ self = @pytest.mark.broken(ticket="11539") def testUseOfRawFileBeanScriptReadCorrectGroupAndUser(self): > self.assertValidScript(lambda v: { 'omero.group': str(v.details.group.id.val), 'omero.user': str(v.details.owner.id.val) }) test/integration/test_permissions.py:831: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/integration/test_permissions.py:813: in assertValidScript store.setFileId(script.id.val, ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = 0427e68d-3256-48b1-a35c-d872b24ee0e2/d1f4ca97-56e3-45d5-9fca-3d2656e1ca4domero.api.RawFileStore -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 fileId = 1 _ctx = {'omero.client.uuid': '158203bf-98d9-43e1-94cd-3b07935f302d', 'omero.group': '1', 'omero.session.uuid': '0427e68d-3256-48b1-a35c-d872b24ee0e2', 'omero.user': '0'} def setFileId(self, fileId, _ctx=None): > return 
_M_omero.api.RawFileStore._op_setFileId.invoke(self, ((fileId, ), _ctx)) E Ice.UnknownException: exception ::Ice::UnknownException E { E unknown = ome.conditions.SecurityViolation: User 4735 is not an admin and so cannot set uid to 0 E at ome.security.basic.BasicEventContext.checkAndInitialize(BasicEventContext.java:141) E at ome.security.basic.CurrentDetails.checkAndInitialize(CurrentDetails.java:317) E at ome.security.basic.BasicSecuritySystem.loadEventContext(BasicSecuritySystem.java:449) E at ome.security.basic.EventHandler.doLogin(EventHandler.java:210) E at ome.security.basic.EventHandler.invoke(EventHandler.java:146) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) E at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) E at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy74.doWork(Unknown Source) E at ome.services.util.Executor$Impl.execute(Executor.java:447) E at ome.services.blitz.repo.RepositoryDaoImpl.getFile(RepositoryDaoImpl.java:866) E at ome.services.blitz.repo.PublicRepositoryI.checkId(PublicRepositoryI.java:823) E at ome.services.blitz.repo.PublicRepositoryI.fileById(PublicRepositoryI.java:367) E at omero.grid._RepositoryTie.fileById(_RepositoryTie.java:78) E at omero.grid._RepositoryDisp.___fileById(_RepositoryDisp.java:393) E at omero.grid._RepositoryDisp.__dispatch(_RepositoryDisp.java:538) E at IceInternal.Incoming.invoke(Incoming.java:221) E at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) E at Ice.ConnectionI.dispatch(ConnectionI.java:1145) E at Ice.ConnectionI.message(ConnectionI.java:1056) E at IceInternal.ThreadPool.run(ThreadPool.java:395) E at IceInternal.ThreadPool.access$300(ThreadPool.java:12) E at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) E at java.base/java.lang.Thread.run(Thread.java:829) E E } ../../../../.venv3/lib64/python3.9/site-packages/omero_api_RawFileStore_ice.py:200: UnknownException _________________________ TestRFS.testTicket1961Basic __________________________ self = @pytest.mark.broken(ticket="11534") def testTicket1961Basic(self): ofile = self.file() rfs = self.client.sf.createRawFileStore() rfs.setFileId(ofile.id.val) rfs.write([0, 1, 2, 3], 0, 4) rfs.close() > self.check_file(ofile) test/integration/test_rawfilestore.py:49: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ofile = object #0 (::omero::model::OriginalFile) { _id = object #1 
(::omero::RLong) { _val = 6207 } _d...annotationLinksCountPerOwner = { } _name = object #26 (::omero::RString) { _val = test } } client = def check_file(self, ofile, client=None): if client is None: client = self.client query = client.sf.getQueryService() ofile = query.get("OriginalFile", ofile.id.val) assert ofile.size.val != -1 > assert ofile.hash.val != "" E AssertionError: assert '' != '' E + where '' = object #0 (::omero::RString)\n{\n _val = \n}.val E + where object #0 (::omero::RString)\n{\n _val = \n} = object #0 (::omero::model::OriginalFile)\n{\n _id = object #1 (::omero::RLong)\n {\n _val = 6207\n }\n _details = object #2 (::omero::model::Details)\n {\n _owner = object #3 (::omero::model::Experimenter)\n {\n _id = object #4 (::omero::RLong)\n {\n _val = 4736\n }\n _details = \n _loaded = False\n _version = \n _groupExperimenterMapSeq = \n {\n }\n _groupExperimenterMapLoaded = False\n _omeName = \n _firstName = \n _middleName = \n _lastName = \n _institution = \n _ldap = \n _email = \n _config = \n {\n }\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n }\n _group = object #5 (::omero::model::ExperimenterGroup)\n {\n _id = object #6 (::omero::RLong)\n {\n _val = 3680\n }\n _details = object #7 (::... }\n groupPermissions = object #20 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -120\n }\n }\n }\n _loaded = True\n _version = \n _pixelsFileMapsSeq = \n {\n }\n _pixelsFileMapsLoaded = False\n _pixelsFileMapsCountPerOwner = \n {\n }\n _path = object #21 (::omero::RString)\n {\n _val = /tmp/test\n }\n _repo = \n _size = object #22 (::omero::RLong)\n {\n _val = 4\n }\n _atime = \n _mtime = object #23 (::omero::RTime)\n {\n _val = 1729754059632\n }\n _ctime = \n _hasher = \n _hash = object #24 (::omero::RString)\n {\n _val = \n }\n _mimetype = object #25 (::omero::RString)\n {\n _val = application/octet-stream\n }\n _filesetEntriesSeq = \n {\n }\n _filesetEntriesLoaded = True\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n _name = object #26 (::omero::RString)\n {\n _val = test\n }\n}.hash test/integration/test_rawfilestore.py:40: AssertionError ____________________ TestRFS.testTicket1961WithKillSession _____________________ self = @pytest.mark.broken(ticket="11534") def testTicket1961WithKillSession(self): ofile = self.file() grp = self.ctx.groupName session = self.client.sf.getSessionService().createUserSession( 1 * 1000, 10000, grp) properties = self.client.getPropertyMap() c = omero.client(properties) s = c.joinSession(session.uuid.val) rfs = s.createRawFileStore() rfs.setFileId(ofile.id.val) rfs.write([0, 1, 2, 3], 0, 4) c.killSession() > self.check_file(ofile) test/integration/test_rawfilestore.py:67: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ofile = object #0 (::omero::model::OriginalFile) { _id = object #1 (::omero::RLong) { _val = 6208 } _d...annotationLinksCountPerOwner = { } _name = object #26 (::omero::RString) { _val = test } } client = def check_file(self, ofile, client=None): if client is None: client = self.client query = client.sf.getQueryService() ofile = query.get("OriginalFile", ofile.id.val) assert ofile.size.val != -1 > assert ofile.hash.val != "" E AssertionError: assert '' != '' E + where '' = object #0 (::omero::RString)\n{\n _val = \n}.val E + where object #0 (::omero::RString)\n{\n _val = \n} = object #0 
(::omero::model::OriginalFile)\n{\n _id = object #1 (::omero::RLong)\n {\n _val = 6208\n }\n _details = object #2 (::omero::model::Details)\n {\n _owner = object #3 (::omero::model::Experimenter)\n {\n _id = object #4 (::omero::RLong)\n {\n _val = 4736\n }\n _details = \n _loaded = False\n _version = \n _groupExperimenterMapSeq = \n {\n }\n _groupExperimenterMapLoaded = False\n _omeName = \n _firstName = \n _middleName = \n _lastName = \n _institution = \n _ldap = \n _email = \n _config = \n {\n }\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n }\n _group = object #5 (::omero::model::ExperimenterGroup)\n {\n _id = object #6 (::omero::RLong)\n {\n _val = 3680\n }\n _details = object #7 (::... }\n groupPermissions = object #20 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -120\n }\n }\n }\n _loaded = True\n _version = \n _pixelsFileMapsSeq = \n {\n }\n _pixelsFileMapsLoaded = False\n _pixelsFileMapsCountPerOwner = \n {\n }\n _path = object #21 (::omero::RString)\n {\n _val = /tmp/test\n }\n _repo = \n _size = object #22 (::omero::RLong)\n {\n _val = 4\n }\n _atime = \n _mtime = object #23 (::omero::RTime)\n {\n _val = 1729754059714\n }\n _ctime = \n _hasher = \n _hash = object #24 (::omero::RString)\n {\n _val = \n }\n _mimetype = object #25 (::omero::RString)\n {\n _val = application/octet-stream\n }\n _filesetEntriesSeq = \n {\n }\n _filesetEntriesLoaded = True\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n _name = object #26 (::omero::RString)\n {\n _val = test\n }\n}.hash test/integration/test_rawfilestore.py:40: AssertionError __________________________ TestRFS.testTicket2161Save __________________________ self = @pytest.mark.broken(ticket="11534") def testTicket2161Save(self): ofile = self.file() rfs = self.client.sf.createRawFileStore() rfs.setFileId(ofile.id.val) rfs.write([0, 1, 2, 3], 0, 4) ofile = rfs.save() > self.check_file(ofile) test/integration/test_rawfilestore.py:76: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ofile = object #0 (::omero::model::OriginalFile) { _id = object #1 (::omero::RLong) { _val = 6209 } _d...annotationLinksCountPerOwner = { } _name = object #26 (::omero::RString) { _val = test } } client = def check_file(self, ofile, client=None): if client is None: client = self.client query = client.sf.getQueryService() ofile = query.get("OriginalFile", ofile.id.val) assert ofile.size.val != -1 > assert ofile.hash.val != "" E AssertionError: assert '' != '' E + where '' = object #0 (::omero::RString)\n{\n _val = \n}.val E + where object #0 (::omero::RString)\n{\n _val = \n} = object #0 (::omero::model::OriginalFile)\n{\n _id = object #1 (::omero::RLong)\n {\n _val = 6209\n }\n _details = object #2 (::omero::model::Details)\n {\n _owner = object #3 (::omero::model::Experimenter)\n {\n _id = object #4 (::omero::RLong)\n {\n _val = 4736\n }\n _details = \n _loaded = False\n _version = \n _groupExperimenterMapSeq = \n {\n }\n _groupExperimenterMapLoaded = False\n _omeName = \n _firstName = \n _middleName = \n _lastName = \n _institution = \n _ldap = \n _email = \n _config = \n {\n }\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n }\n _group = object #5 (::omero::model::ExperimenterGroup)\n {\n _id = object #6 (::omero::RLong)\n {\n _val = 3680\n }\n _details = object #7 (::... 
}\n groupPermissions = object #20 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -120\n }\n }\n }\n _loaded = True\n _version = \n _pixelsFileMapsSeq = \n {\n }\n _pixelsFileMapsLoaded = False\n _pixelsFileMapsCountPerOwner = \n {\n }\n _path = object #21 (::omero::RString)\n {\n _val = /tmp/test\n }\n _repo = \n _size = object #22 (::omero::RLong)\n {\n _val = 4\n }\n _atime = \n _mtime = object #23 (::omero::RTime)\n {\n _val = 1729754059760\n }\n _ctime = \n _hasher = \n _hash = object #24 (::omero::RString)\n {\n _val = \n }\n _mimetype = object #25 (::omero::RString)\n {\n _val = application/octet-stream\n }\n _filesetEntriesSeq = \n {\n }\n _filesetEntriesLoaded = True\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n _name = object #26 (::omero::RString)\n {\n _val = test\n }\n}.hash test/integration/test_rawfilestore.py:40: AssertionError _____________________________ TestRFS.testNoWrite ______________________________ self = @pytest.mark.broken(ticket="11534") def testNoWrite(self): group = self.new_group(perms="rwr---") client1 = self.new_client(group=group) client2 = self.new_client(group=group) ofile = self.file(client=client1) rfs = client1.sf.createRawFileStore() rfs.setFileId(ofile.id.val) rfs.write(b"0123", 0, 4) rfs.close() > self.check_file(ofile, client=client1) test/integration/test_rawfilestore.py:94: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = ofile = object #0 (::omero::model::OriginalFile) { _id = object #1 (::omero::RLong) { _val = 6210 } _d...annotationLinksCountPerOwner = { } _name = object #26 (::omero::RString) { _val = test } } client = def check_file(self, ofile, client=None): if client is None: client = self.client query = client.sf.getQueryService() ofile = query.get("OriginalFile", ofile.id.val) assert ofile.size.val != -1 > assert ofile.hash.val != "" E AssertionError: assert '' != '' E + where '' = object #0 (::omero::RString)\n{\n _val = \n}.val E + where object #0 (::omero::RString)\n{\n _val = \n} = object #0 (::omero::model::OriginalFile)\n{\n _id = object #1 (::omero::RLong)\n {\n _val = 6210\n }\n _details = object #2 (::omero::model::Details)\n {\n _owner = object #3 (::omero::model::Experimenter)\n {\n _id = object #4 (::omero::RLong)\n {\n _val = 4737\n }\n _details = \n _loaded = False\n _version = \n _groupExperimenterMapSeq = \n {\n }\n _groupExperimenterMapLoaded = False\n _omeName = \n _firstName = \n _middleName = \n _lastName = \n _institution = \n _ldap = \n _email = \n _config = \n {\n }\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n }\n _group = object #5 (::omero::model::ExperimenterGroup)\n {\n _id = object #6 (::omero::RLong)\n {\n _val = 3681\n }\n _details = object #7 (::... 
}\n groupPermissions = object #20 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -56\n }\n }\n }\n _loaded = True\n _version = \n _pixelsFileMapsSeq = \n {\n }\n _pixelsFileMapsLoaded = False\n _pixelsFileMapsCountPerOwner = \n {\n }\n _path = object #21 (::omero::RString)\n {\n _val = /tmp/test\n }\n _repo = \n _size = object #22 (::omero::RLong)\n {\n _val = 4\n }\n _atime = \n _mtime = object #23 (::omero::RTime)\n {\n _val = 1729754065593\n }\n _ctime = \n _hasher = \n _hash = object #24 (::omero::RString)\n {\n _val = \n }\n _mimetype = object #25 (::omero::RString)\n {\n _val = application/octet-stream\n }\n _filesetEntriesSeq = \n {\n }\n _filesetEntriesLoaded = True\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n _name = object #26 (::omero::RString)\n {\n _val = test\n }\n}.hash
test/integration/test_rawfilestore.py:40: AssertionError
__________________ TestRepoRawFileStore.testFailedWriteNoFile __________________
self =

    @pytest.mark.broken(ticket="11610")
    def testFailedWriteNoFile(self):
        # Without a single write, no file is produced
        rfs = self.repoPrx.file(self.repo_filename, "rw")  # create empty file
        rfs.close()
        rfs = self.repoPrx.file(self.repo_filename, "r")
        with pytest.raises(omero.ResourceError):
>           rfs.size()
E           Failed: DID NOT RAISE

test/integration/test_reporawfilestore.py:68: Failed
______________________ TestScripts.testAutoFillTicket2326 ______________________
self =

    @pytest.mark.broken(ticket="11539")
    def testAutoFillTicket2326(self):
        SCRIPT = """if True:
            import omero.scripts
            import omero.rtypes
            client = omero.scripts.client(
                "ticket2326", omero.scripts.Long("width", optional=True))
            width = client.getInput("width")
            print(width)
            client.setOutput(
                "noWidthKey",
                omero.rtypes.rbool("width" not in client.getInputKeys()))
            client.setOutput("widthIsNull", omero.rtypes.rbool(width is None))
            """
        impl = omero.processor.usermode_processor(self.client)
        svc = self.client.sf.getScriptService()
        try:
            scriptID = svc.uploadScript("/test/testAutoFillTicket2326", SCRIPT)
>           process = svc.runScript(scriptID, {}, None)

test/integration/test_scripts.py:378:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = b436a087-6105-440c-b1a9-a841e3186003/c8dc2414-6967-46e6-b946-25e3b022bdb1omero.api.IScript -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000
scriptID = 6215, inputs = {}, waitSecs = None, _ctx = None

    def runScript(self, scriptID, inputs, waitSecs, _ctx=None):
>       return _M_omero.api.IScript._op_runScript.invoke(self, ((scriptID, inputs, waitSecs), _ctx))
E       omero.NoProcessorAvailable: exception ::omero::NoProcessorAvailable
E       {
E           serverStackTrace =
E           serverExceptionClass =
E           message = No processor available!
[0 response(s)] E processorCount = 0 E } ../../../../.venv3/lib64/python3.9/site-packages/omero_api_IScript_ice.py:935: NoProcessorAvailable ------------------------------ Captured log call ------------------------------- INFO omero.util.Resources:__init__.py:652 Starting INFO omero.processor.ProcessorI:processor.py:814 Registering processor "4?!*d)kT%7RTBV@EBZhS/UsermodeProcessor-44947f69-a654-4e8c-970d-dbccea9497df" -t -e 1.1:tcp -h 127.0.0.1 -p 39915 -t 60000 INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4954 Traceback (most recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 857, in willAccept file, handle = self.lookup(scriptContext) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 829, in lookup file = prx.validateScript(job, self.accepts_list) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 40, in handler return func(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_api_IScript_ice.py", line 1065, in validateScript return _M_omero.api.IScript._op_validateScript.invoke(self, ((j, acceptsList), _ctx)) omero.ValidationException: exception ::omero::ValidationException { serverStackTrace = ome.conditions.ValidationException: Found wrong number of files: [] at ome.services.blitz.impl.ScriptI$15$1.doWork(ScriptI.java:577) at jdk.internal.reflect.GeneratedMethodAccessor267.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.services.util.Executor$Impl$Interceptor.invoke(Executor.java:568) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at 
ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy74.doWork(Unknown Source) at ome.services.util.Executor$Impl.execute(Executor.java:447) at ome.services.blitz.impl.ScriptI$15.call(ScriptI.java:570) at ome.services.throttling.Callback2.run(Callback2.java:43) at ome.services.throttling.InThreadThrottlingStrategy.safeRunnableCall(InThreadThrottlingStrategy.java:80) at ome.services.blitz.impl.AbstractAmdServant.safeRunnableCall(AbstractAmdServant.java:159) at ome.services.blitz.impl.ScriptI.validateScript_async(ScriptI.java:553) at jdk.internal.reflect.GeneratedMethodAccessor597.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy94.validateScript_async(Unknown Source) at omero.api._IScriptTie.validateScript_async(_IScriptTie.java:144) at omero.api._IScriptDisp.___validateScript(_IScriptDisp.java:704) at omero.api._IScriptDisp.__dispatch(_IScriptDisp.java:819) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ValidationException message = Found wrong number of files: [] } INFO omero.remote:decorators.py:75 Rslt: ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4954 Traceback (most recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 857, in willAccept file, handle = self.lookup(scriptContext) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 829, in lookup file = prx.validateScript(job, self.accepts_list) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 40, in handler return func(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_api_IScript_ice.py", line 1065, in validateScript return _M_omero.api.IScript._op_validateScript.invoke(self, ((j, acceptsList), _ctx)) omero.ValidationException: exception ::omero::ValidationException { serverStackTrace = ome.conditions.ValidationException: Found wrong number of files: [] at ome.services.blitz.impl.ScriptI$15$1.doWork(ScriptI.java:577) at 
jdk.internal.reflect.GeneratedMethodAccessor267.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.services.util.Executor$Impl$Interceptor.invoke(Executor.java:568) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy74.doWork(Unknown Source) at ome.services.util.Executor$Impl.execute(Executor.java:447) at ome.services.blitz.impl.ScriptI$15.call(ScriptI.java:570) at ome.services.throttling.Callback2.run(Callback2.java:43) at ome.services.throttling.InThreadThrottlingStrategy.safeRunnableCall(InThreadThrottlingStrategy.java:80) at ome.services.blitz.impl.AbstractAmdServant.safeRunnableCall(AbstractAmdServant.java:159) at ome.services.blitz.impl.ScriptI.validateScript_async(ScriptI.java:553) at jdk.internal.reflect.GeneratedMethodAccessor597.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy94.validateScript_async(Unknown Source) at omero.api._IScriptTie.validateScript_async(_IScriptTie.java:144) at 
omero.api._IScriptDisp.___validateScript(_IScriptDisp.java:704) at omero.api._IScriptDisp.__dispatch(_IScriptDisp.java:819) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ValidationException message = Found wrong number of files: [] } INFO omero.remote:decorators.py:75 Rslt: INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4955 Traceback (most recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 857, in willAccept file, handle = self.lookup(scriptContext) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 829, in lookup file = prx.validateScript(job, self.accepts_list) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 40, in handler return func(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_api_IScript_ice.py", line 1065, in validateScript return _M_omero.api.IScript._op_validateScript.invoke(self, ((j, acceptsList), _ctx)) omero.ValidationException: exception ::omero::ValidationException { serverStackTrace = ome.conditions.ValidationException: Found wrong number of files: [] at ome.services.blitz.impl.ScriptI$15$1.doWork(ScriptI.java:577) at jdk.internal.reflect.GeneratedMethodAccessor267.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.services.util.Executor$Impl$Interceptor.invoke(Executor.java:568) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at 
ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy74.doWork(Unknown Source) at ome.services.util.Executor$Impl.execute(Executor.java:447) at ome.services.blitz.impl.ScriptI$15.call(ScriptI.java:570) at ome.services.throttling.Callback2.run(Callback2.java:43) at ome.services.throttling.InThreadThrottlingStrategy.safeRunnableCall(InThreadThrottlingStrategy.java:80) at ome.services.blitz.impl.AbstractAmdServant.safeRunnableCall(AbstractAmdServant.java:159) at ome.services.blitz.impl.ScriptI.validateScript_async(ScriptI.java:553) at jdk.internal.reflect.GeneratedMethodAccessor597.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy94.validateScript_async(Unknown Source) at omero.api._IScriptTie.validateScript_async(_IScriptTie.java:144) at omero.api._IScriptDisp.___validateScript(_IScriptDisp.java:704) at omero.api._IScriptDisp.__dispatch(_IScriptDisp.java:819) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ValidationException message = Found wrong number of files: [] } INFO omero.remote:decorators.py:75 Rslt: ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4955 Traceback (most recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 857, in willAccept file, handle = self.lookup(scriptContext) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 829, in lookup file = prx.validateScript(job, self.accepts_list) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 40, in handler return func(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_api_IScript_ice.py", line 1065, in validateScript return _M_omero.api.IScript._op_validateScript.invoke(self, ((j, acceptsList), _ctx)) omero.ValidationException: exception 
::omero::ValidationException { serverStackTrace = ome.conditions.ValidationException: Found wrong number of files: [] at ome.services.blitz.impl.ScriptI$15$1.doWork(ScriptI.java:577) at jdk.internal.reflect.GeneratedMethodAccessor267.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.services.util.Executor$Impl$Interceptor.invoke(Executor.java:568) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy74.doWork(Unknown Source) at ome.services.util.Executor$Impl.execute(Executor.java:447) at ome.services.blitz.impl.ScriptI$15.call(ScriptI.java:570) at ome.services.throttling.Callback2.run(Callback2.java:43) at ome.services.throttling.InThreadThrottlingStrategy.safeRunnableCall(InThreadThrottlingStrategy.java:80) at ome.services.blitz.impl.AbstractAmdServant.safeRunnableCall(AbstractAmdServant.java:159) at ome.services.blitz.impl.ScriptI.validateScript_async(ScriptI.java:553) at jdk.internal.reflect.GeneratedMethodAccessor597.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at 
org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy94.validateScript_async(Unknown Source) at omero.api._IScriptTie.validateScript_async(_IScriptTie.java:144) at omero.api._IScriptDisp.___validateScript(_IScriptDisp.java:704) at omero.api._IScriptDisp.__dispatch(_IScriptDisp.java:819) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ValidationException message = Found wrong number of files: [] } INFO omero.remote:decorators.py:75 Rslt: INFO omero.processor.ProcessorI:__init__.py:597 Cleaning up INFO omero.processor.ProcessorI:__init__.py:599 Done ______________ TestScripts.testParamLoadingPerformanceTicket2285 _______________ self = @pytest.mark.broken(reason="Minor performance failure", ticket="11539") def testParamLoadingPerformanceTicket2285(self): root_client = self.new_client(system=True) svc = root_client.sf.getScriptService() SCRIPT = """if True: import omero.model as OM import omero.rtypes as OR import omero.scripts as OS c = OS.client("perf test", OS.Long("a", min=0, max=5), OS.String("b", values=("a","b","c")), OS.List("c").ofType(OM.ImageI)) """ upload_time, scriptID = self.timeit( svc.uploadOfficialScript, "/test/perf%s.py" % self.uuid(), SCRIPT) impl = omero.processor.usermode_processor(root_client) try: params_time, params = self.timeit(svc.getParams, scriptID) assert params_time < (upload_time / 10), \ "upload_time(%s) <= 10 * params_time(%s)!" % \ (upload_time, params_time) assert params_time < 0.1, "params_time(%s) >= 0.01 !" 
% params_time > run_time, process = self.timeit( svc.runScript, scriptID, wrap({"a": int(5)}).val, None) test/integration/test_scripts.py:417: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../../.venv3/lib64/python3.9/site-packages/omero/testlib/__init__.py:612: in timeit rv = func(*args, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = a03e0bf2-270e-4fda-8cbe-d819f2e448db/1c5b6717-5e25-405b-8727-afca2b3ff2f2omero.api.IScript -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 scriptID = 6216, inputs = {'a': object #0 (::omero::RInt) { _val = 5 }} waitSecs = None, _ctx = None def runScript(self, scriptID, inputs, waitSecs, _ctx=None): > return _M_omero.api.IScript._op_runScript.invoke(self, ((scriptID, inputs, waitSecs), _ctx)) E omero.ValidationException: exception ::omero::ValidationException E { E serverStackTrace = E serverExceptionClass = E message = Invalid parameters: E WRONG TYPE for "a": != E E } ../../../../.venv3/lib64/python3.9/site-packages/omero_api_IScript_ice.py:935: ValidationException ------------------------------ Captured log call ------------------------------- INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4741, group=3684, script=4956 Traceback (most recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 858, in willAccept handle.close() File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 40, in handler return func(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_ServicesF_ice.py", line 287, in close return _M_omero.api.StatefulServiceInterface._op_close.invoke(self, ((), _ctx)) omero.SecurityViolation: exception ::omero::SecurityViolation { serverStackTrace = ome.conditions.SecurityViolation: User 4740 is not a member of group 3684 and cannot login at ome.security.basic.BasicSecuritySystem.loadEventContext(BasicSecuritySystem.java:514) at ome.security.basic.EventHandler.doLogin(EventHandler.java:210) at ome.security.basic.EventHandler.invoke(EventHandler.java:146) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.SessionHandler.doStateful(SessionHandler.java:216) at ome.tools.hibernate.SessionHandler.invoke(SessionHandler.java:200) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy92.close(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor527.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy92.close(Unknown Source) at ome.services.blitz.impl.AbstractCloseableAmdServant.close_async(AbstractCloseableAmdServant.java:90) at jdk.internal.reflect.GeneratedMethodAccessor558.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy93.close_async(Unknown Source) at omero.api._JobHandleTie.close_async(_JobHandleTie.java:144) at omero.api._StatefulServiceInterfaceDisp.___close(_StatefulServiceInterfaceDisp.java:185) at omero.api._JobHandleDisp.__dispatch(_JobHandleDisp.java:597) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.SecurityViolation message = User 4740 is not a member of group 3684 and cannot login } INFO omero.remote:decorators.py:75 Rslt: INFO omero.remote:decorators.py:75 Rslt: INFO omero.util.Resources:__init__.py:652 Starting INFO omero.processor.ProcessorI:processor.py:814 Registering processor jrHWfd;[]\"]IZu$)Ck1M/UsermodeProcessor-3ae098f0-ddab-4fe3-8067-85c5ba120190 -t -e 1.1:tcp -h 127.0.0.1 -p 39915 -t 60000 INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept INFO 
omero.remote:decorators.py:70 Meth: ProcessorI.willAccept ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4741, group=3684, script=4957 Traceback (most recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 858, in willAccept handle.close() File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 40, in handler return func(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_ServicesF_ice.py", line 287, in close return _M_omero.api.StatefulServiceInterface._op_close.invoke(self, ((), _ctx)) omero.SecurityViolation: exception ::omero::SecurityViolation { serverStackTrace = ome.conditions.SecurityViolation: User 4740 is not a member of group 3684 and cannot login at ome.security.basic.BasicSecuritySystem.loadEventContext(BasicSecuritySystem.java:514) at ome.security.basic.EventHandler.doLogin(EventHandler.java:210) at ome.security.basic.EventHandler.invoke(EventHandler.java:146) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.SessionHandler.doStateful(SessionHandler.java:216) at ome.tools.hibernate.SessionHandler.invoke(SessionHandler.java:200) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy92.close(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor527.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy92.close(Unknown Source) at 
ome.services.blitz.impl.AbstractCloseableAmdServant.close_async(AbstractCloseableAmdServant.java:90) at jdk.internal.reflect.GeneratedMethodAccessor558.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy93.close_async(Unknown Source) at omero.api._JobHandleTie.close_async(_JobHandleTie.java:144) at omero.api._StatefulServiceInterfaceDisp.___close(_StatefulServiceInterfaceDisp.java:185) at omero.api._JobHandleDisp.__dispatch(_JobHandleDisp.java:597) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.SecurityViolation message = User 4740 is not a member of group 3684 and cannot login } INFO omero.remote:decorators.py:75 Rslt: ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4741, group=3684, script=4957 Traceback (most recent call last): File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 857, in willAccept file, handle = self.lookup(scriptContext) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 829, in lookup file = prx.validateScript(job, self.accepts_list) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 40, in handler return func(*args, **kwargs) File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_api_IScript_ice.py", line 1065, in validateScript return _M_omero.api.IScript._op_validateScript.invoke(self, ((j, acceptsList), _ctx)) omero.ValidationException: exception ::omero::ValidationException { serverStackTrace = ome.conditions.ValidationException: Found wrong number of files: [] at ome.services.blitz.impl.ScriptI$15$1.doWork(ScriptI.java:577) at jdk.internal.reflect.GeneratedMethodAccessor267.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.services.util.Executor$Impl$Interceptor.invoke(Executor.java:568) at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy74.doWork(Unknown Source) at ome.services.util.Executor$Impl.execute(Executor.java:447) at ome.services.blitz.impl.ScriptI$15.call(ScriptI.java:570) at ome.services.throttling.Callback2.run(Callback2.java:43) at ome.services.throttling.InThreadThrottlingStrategy.safeRunnableCall(InThreadThrottlingStrategy.java:80) at ome.services.blitz.impl.AbstractAmdServant.safeRunnableCall(AbstractAmdServant.java:159) at ome.services.blitz.impl.ScriptI.validateScript_async(ScriptI.java:553) at jdk.internal.reflect.GeneratedMethodAccessor597.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy94.validateScript_async(Unknown Source) at omero.api._IScriptTie.validateScript_async(_IScriptTie.java:144) at omero.api._IScriptDisp.___validateScript(_IScriptDisp.java:704) at omero.api._IScriptDisp.__dispatch(_IScriptDisp.java:819) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ValidationException message = Found wrong number of files: [] } INFO 
omero.remote:decorators.py:75 Rslt:
INFO omero.remote:decorators.py:75 Rslt:
INFO omero.processor.ProcessorI:__init__.py:597 Cleaning up
INFO omero.processor.ProcessorI:__init__.py:599 Done
_____________________________ TestScripts.test3527 _____________________________

self =

    @pytest.mark.broken(ticket="11610")
    def test3527(self):
        SCRIPT = """if True:
            import omero.scripts
            import omero.rtypes
            client = omero.scripts.client("ticket3527", \
                omero.scripts.Long("gid", out=True))
            ec = client.sf.getAdminService().getEventContext()
            gid = ec.groupId
            client.setOutput("gid", omero.rtypes.rlong(gid))
            """
        impl = omero.processor.usermode_processor(self.client)
        svc = self.client.sf.getScriptService()
        try:
            scriptID = svc.uploadScript("/test/test3527", SCRIPT)
>           process = svc.runScript(scriptID, {}, None)

test/integration/test_scripts.py:521:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = b436a087-6105-440c-b1a9-a841e3186003/c8dc2414-6967-46e6-b946-25e3b022bdb1omero.api.IScript -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000
scriptID = 6218, inputs = {}, waitSecs = None, _ctx = None

    def runScript(self, scriptID, inputs, waitSecs, _ctx=None):
>       return _M_omero.api.IScript._op_runScript.invoke(self, ((scriptID, inputs, waitSecs), _ctx))
E       omero.NoProcessorAvailable: exception ::omero::NoProcessorAvailable
E       {
E       serverStackTrace =
E       serverExceptionClass =
E       message = No processor available! [0 response(s)]
E       processorCount = 0
E       }

../../../../.venv3/lib64/python3.9/site-packages/omero_api_IScript_ice.py:935: NoProcessorAvailable
------------------------------ Captured log call -------------------------------
INFO omero.util.Resources:__init__.py:652 Starting
INFO omero.processor.ProcessorI:processor.py:814 Registering processor "4?!*d)kT%7RTBV@EBZhS/UsermodeProcessor-9ce8d19a-594a-4539-a705-5687053b1e34" -t -e 1.1:tcp -h 127.0.0.1 -p 39915 -t 60000
INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept
INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept
INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept
INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept
ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4958
Traceback (most recent call last):
  File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 857, in willAccept
    file, handle = self.lookup(scriptContext)
  File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 829, in lookup
    file = prx.validateScript(job, self.accepts_list)
  File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/processor.py", line 40, in handler
    return func(*args, **kwargs)
  File "/home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero_api_IScript_ice.py", line 1065, in validateScript
    return _M_omero.api.IScript._op_validateScript.invoke(self, ((j, acceptsList), _ctx))
omero.ValidationException: exception ::omero::ValidationException
{ serverStackTrace = ome.conditions.ValidationException: Found wrong number of files: [] at ome.services.blitz.impl.ScriptI$15$1.doWork(ScriptI.java:577) at jdk.internal.reflect.GeneratedMethodAccessor267.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at
org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.services.util.Executor$Impl$Interceptor.invoke(Executor.java:568) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy74.doWork(Unknown Source) at ome.services.util.Executor$Impl.execute(Executor.java:447) at ome.services.blitz.impl.ScriptI$15.call(ScriptI.java:570) at ome.services.throttling.Callback2.run(Callback2.java:43) at ome.services.throttling.InThreadThrottlingStrategy.safeRunnableCall(InThreadThrottlingStrategy.java:80) at ome.services.blitz.impl.AbstractAmdServant.safeRunnableCall(AbstractAmdServant.java:159) at ome.services.blitz.impl.ScriptI.validateScript_async(ScriptI.java:553) at jdk.internal.reflect.GeneratedMethodAccessor597.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy94.validateScript_async(Unknown Source) at omero.api._IScriptTie.validateScript_async(_IScriptTie.java:144) at omero.api._IScriptDisp.___validateScript(_IScriptDisp.java:704) at omero.api._IScriptDisp.__dispatch(_IScriptDisp.java:819) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at 
Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ValidationException message = Found wrong number of files: [] }
INFO omero.remote:decorators.py:75 Rslt:
ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4958
INFO omero.remote:decorators.py:75 Rslt:
ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4958
INFO omero.remote:decorators.py:75 Rslt:
ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4958
INFO omero.remote:decorators.py:75 Rslt:
INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept
INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept
INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept
INFO omero.remote:decorators.py:70 Meth: ProcessorI.willAccept
ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4959
INFO omero.remote:decorators.py:75 Rslt:
ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4959
INFO omero.remote:decorators.py:75 Rslt:
ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4959
INFO omero.remote:decorators.py:75 Rslt:
ERROR omero.processor.ProcessorI:processor.py:861 File lookup failed: user=4740, group=3683, script=4959
INFO omero.remote:decorators.py:75 Rslt:
INFO omero.processor.ProcessorI:__init__.py:597 Cleaning up
INFO omero.processor.ProcessorI:__init__.py:599 Done
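For context, the pattern test3527 exercises (upload a script, serve it from a user-mode processor registered in the test's own session, then run it) looks roughly like the sketch below. It is illustrative only and assumes an already-connected client object named client; any name not shown in the log above is hypothetical.

    # Sketch only: run an uploaded script via a user-mode processor.
    # `client` is assumed to be a connected OMERO client (not defined here).
    import omero.processor

    SCRIPT = """if True:
        import omero.scripts
        import omero.rtypes
        client = omero.scripts.client("ticket3527", omero.scripts.Long("gid", out=True))
        ec = client.sf.getAdminService().getEventContext()
        client.setOutput("gid", omero.rtypes.rlong(ec.groupId))
    """

    impl = omero.processor.usermode_processor(client)   # registers a processor for this session
    try:
        svc = client.sf.getScriptService()
        script_id = svc.uploadScript("/test/test3527", SCRIPT)
        proc = svc.runScript(script_id, {}, None)        # raises NoProcessorAvailable when no
                                                         # registered processor accepts the job
    finally:
        impl.cleanup()                                   # release the user-mode processor

The repeated "Found wrong number of files: []" errors in the captured log come from the processor's willAccept/validateScript lookup, which is consistent with the NoProcessorAvailable (processorCount = 0) failure reported for the runScript call.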
""" group = self.new_group(perms="rw----") > self.assert10618(group, self.root, True) test/integration/test_thumbnailPerms.py:256: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/integration/test_thumbnailPerms.py:217: in assert10618 s = tb_prx.getThumbnailByLongestSideSet(rint(16), [pId], *ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = 53fad1a3-6864-402b-8e17-7dc9c63f272b/5635ebfa-a5f1-4f55-8bb3-15cf85bac0bbomero.api.ThumbnailStore -t -e 1.1:tcp -h 172.18.0.10 -p 45809 -t 60000 size = object #0 (::omero::RInt) { _val = 16 }, pixelsIds = [2901] _ctx = None def getThumbnailByLongestSideSet(self, size, pixelsIds, _ctx=None): > return _M_omero.api.ThumbnailStore._op_getThumbnailByLongestSideSet.invoke(self, ((size, pixelsIds), _ctx)) E omero.ResourceError: exception ::omero::ResourceError E { E serverStackTrace = ome.conditions.ResourceError: Error retrieving Pixels id:2901. Pixels set does not exist or the user id:0 has insufficient permissions to retrieve it. E at ome.services.ThumbnailCtx.isExtendedGraphCritical(ThumbnailCtx.java:749) E at ome.services.ThumbnailCtx.createAndPrepareMissingRenderingSettings(ThumbnailCtx.java:388) E at ome.services.ThumbnailBean.getThumbnailByLongestSideSet(ThumbnailBean.java:1003) E at jdk.internal.reflect.GeneratedMethodAccessor3879.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.EventHandler.invoke(EventHandler.java:154) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.tools.hibernate.SessionHandler.doStateful(SessionHandler.java:216) E at ome.tools.hibernate.SessionHandler.invoke(SessionHandler.java:200) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) E at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) E at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy113.getThumbnailByLongestSideSet(Unknown Source) E at jdk.internal.reflect.GeneratedMethodAccessor3879.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at 
java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy113.getThumbnailByLongestSideSet(Unknown Source) E at jdk.internal.reflect.GeneratedMethodAccessor3884.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) E at ome.services.throttling.Callback.run(Callback.java:56) E at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) E at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) E at ome.services.blitz.impl.ThumbnailStoreI.getThumbnailByLongestSideSet_async(ThumbnailStoreI.java:83) E at jdk.internal.reflect.GeneratedMethodAccessor3883.invoke(Unknown Source) E at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) E at java.base/java.lang.reflect.Method.invoke(Method.java:566) E at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) E at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) E at omero.cmd.CallContext.invoke(CallContext.java:85) E at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) E at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) E at com.sun.proxy.$Proxy114.getThumbnailByLongestSideSet_async(Unknown Source) E at omero.api._ThumbnailStoreTie.getThumbnailByLongestSideSet_async(_ThumbnailStoreTie.java:132) E at omero.api._ThumbnailStoreDisp.___getThumbnailByLongestSideSet(_ThumbnailStoreDisp.java:743) E at omero.api._ThumbnailStoreDisp.__dispatch(_ThumbnailStoreDisp.java:1088) E at IceInternal.Incoming.invoke(Incoming.java:221) E at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) E at Ice.ConnectionI.dispatch(ConnectionI.java:1145) E at Ice.ConnectionI.message(ConnectionI.java:1056) E at IceInternal.ThreadPool.run(ThreadPool.java:395) E at IceInternal.ThreadPool.access$300(ThreadPool.java:12) E at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) E at java.base/java.lang.Thread.run(Thread.java:829)
E
E       serverExceptionClass = ome.conditions.ResourceError
E       message = Error retrieving Pixels id:2901. Pixels set does not exist or the user id:0 has insufficient permissions to retrieve it.
E       }

../../../../.venv3/lib64/python3.9/site-packages/omero_api_ThumbnailStore_ice.py:730: ResourceError
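The test's docstring notes that serving this call would require the server to try omero.group=-1 for the user. On the client side, the group context can be supplied explicitly through the Ice call context, roughly as in this sketch (illustrative only; client and pixels_id are assumed to exist and are not values taken from the log):

    # Sketch only: fetch a thumbnail while letting the lookup span all of the user's groups.
    from omero.rtypes import rint

    tb = client.sf.createThumbnailStore()        # stateful ThumbnailStore service
    try:
        ctx = {"omero.group": "-1"}              # "-1" = resolve the Pixels set across groups
        thumbs = tb.getThumbnailByLongestSideSet(rint(16), [pixels_id], ctx)
    finally:
        tb.close()                               # always close stateful services

Without such a context (or a server-side fallback), the caller's current group does not contain Pixels 2901, which appears to be what the ResourceError above reports for user id:0.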
___________________________ TestTickets2000.test1184 ___________________________

self =

    @pytest.mark.broken(ticket="11543")
    def test1184(self):
        uuid = self.uuid()
        client = self.new_client(perms="rw----")
        query = client.sf.getQueryService()
        update = client.sf.getUpdateService()
        admin = client.sf.getAdminService()
        cont = client.sf.getContainerService()
        ds = self.new_dataset(name='test1184-ds-%s' % (uuid))
        for i in range(1, 2001):
            img = self.new_image(name='img1184-%s' % (uuid))
            ds.linkImage(img)
        ds = update.saveAndReturnObject(ds)
        c = cont.getCollectionCount(
            ds.__class__.__name__, ("imageLinks"), [ds.id.val], None)
        assert c[ds.id.val] == 2000
        page = 1
        p = omero.sys.Parameters()
        p.map = {}
        p.map["eid"] = rlong(admin.getEventContext().userId)
        p.map["oid"] = rlong(ds.id.val)
        if page is not None:
            f = omero.sys.Filter()
            f.limit = rint(24)
            f.offset = rint((int(page) - 1) * 24)
            p.theFilter = f
        sql = "select im from Image im join fetch im.details.owner " \
            "join fetch im.details.group left outer join fetch " \
            "im.datasetLinks dil left outer join fetch dil.parent d " \
            "where d.id = :oid and im.details.owner.id=:eid " \
            "order by im.id asc"
        start = time.time()
        res = query.findAllByQuery(sql, p)
        assert 24 == len(res)
        end = time.time()
        elapsed = end - start
>       assert elapsed < 3.0, \
            "Expected the test to complete in < 3 seconds, took: %f" % elapsed
E       AssertionError: Expected the test to complete in < 3 seconds, took: 11.561142
E       assert 11.561142206192017 < 3.0

test/integration/test_tickets2000.py:370: AssertionError
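The slow step in this test is the paged HQL query near the end. Stripped of the test fixture, that query pattern looks like the sketch below (illustrative; query_service, user_id and dataset_id are assumed names, not values from the log, and an environment with the OMERO Ice bindings loaded is assumed):

    # Sketch only: page through a dataset's images, 24 per page, via IQuery.
    import time
    import omero
    from omero.rtypes import rlong, rint

    PAGE_SIZE = 24
    page = 1
    p = omero.sys.Parameters()
    p.map = {"eid": rlong(user_id), "oid": rlong(dataset_id)}
    f = omero.sys.Filter()
    f.limit = rint(PAGE_SIZE)
    f.offset = rint((page - 1) * PAGE_SIZE)
    p.theFilter = f
    sql = ("select im from Image im "
           "join fetch im.details.owner join fetch im.details.group "
           "left outer join fetch im.datasetLinks dil "
           "left outer join fetch dil.parent d "
           "where d.id = :oid and im.details.owner.id = :eid "
           "order by im.id asc")
    start = time.time()
    images = query_service.findAllByQuery(sql, p)   # one page of results
    print(len(images), "images in", time.time() - start, "seconds")

The test asserts that fetching one 24-image page from a 2000-image dataset finishes in under 3 seconds; in this run it took about 11.6 seconds.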
See https://setuptools.pypa.io/en/latest/pkg_resources.html from pkg_resources import DistributionNotFound, get_distribution OmeroPy/test/integration/clitest/test_import.py::TestImport::testTargetInDifferentGroup[Dataset-test.fake--d] OmeroPy/test/integration/clitest/test_import.py::TestImport::testTargetInDifferentGroup[Screen-SPW&plates=1&plateRows=1&plateCols=1&fields=1&plateAcqs=1.fake--r] /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/plugins/sessions.py:176: DeprecationWarning: OMERO_SESSION_DIR is deprecated. Use OMERO_SESSIONDIR instead. warnings.warn( OmeroPy/test/integration/gatewaytest/test_multi_group.py::TestHistory::testCreateHistory OmeroPy/test/integration/gatewaytest/test_multi_group.py::TestHistory::testCreateHistory OmeroPy/test/integration/gatewaytest/test_multi_group.py::TestScript::testRunScript OmeroPy/test/integration/gatewaytest/test_multi_group.py::TestScript::testRunScript OmeroPy/test/integration/gatewaytest/test_multi_group.py::TestScript::testRunScript /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/gateway/__init__.py:4810: DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead logger.warn("%s on %s to <%s> %s(%r, %r)", OmeroPy/test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testPrivate10618RootWithNoCtx /home/omero/workspace/OMERO-test-integration/.venv3/lib64/python3.9/site-packages/omero/util/script_utils.py:1093: DeprecationWarning: tostring() is deprecated. Use tobytes() instead. converted_plane = byte_swapped_plane.tostring() -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html - generated xml file: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroPy/target/reports/broken/junit-results.xml - =========================== short test summary info ============================ FAILED test/integration/clitest/test_fs.py::TestFS::testRenameAdminOnly - AssertionError: assert False + where False = ('SecurityViolation: Admins only!\n') + where = 'disabled since OMERO 5.4.7 due to Pixels.path bug\n'.endswith FAILED test/integration/clitest/test_import.py::TestImport::testTargetInDifferentGroup[Dataset-test.fake--d] - omero.cli.NonZeroReturnCode: assert failed FAILED test/integration/clitest/test_import.py::TestImport::testTargetInDifferentGroup[Screen-SPW&plates=1&plateRows=1&plateCols=1&fields=1&plateAcqs=1.fake--r] - omero.cli.NonZeroReturnCode: assert failed FAILED test/integration/gatewaytest/test_multi_group.py::TestHistory::testCreateHistory - AssertionError: Logs count should match number of objects assert 1 == 0 FAILED test/integration/gatewaytest/test_multi_group.py::TestScript::testRunScript - omero.NoProcessorAvailable: exception ::omero::NoProcessorAvailable { serverStackTrace = serverExceptionClass = message = No processor available! 
[0 response(s)] processorCount = 0 } FAILED test/integration/gatewaytest/test_performance.py::TestPerformance::testListFileAnnotations - AssertionError: Blitz listFileAnnotations() should be faster than getObjects('FileAnnotation') assert 0.3212752342224121 < 0.16168785095214844 FAILED test/integration/gatewaytest/test_user.py::TestUser::testCrossGroupRead - assert not 2719 in [2718, 1, 2730, 2719, 2780] + where 2719 = getId() + where getId = <_ExperimenterGroupWrapper id=2719>.getId + and [2718, 1, 2730, 2719, 2780] = object #0 (::omero::sys::EventContext)\n{\n shareId = -1\n sessionId = 10312\n sessionUuid = 9b756529-dc91-4678-b52c-11c942c2d2aa\n userId = 3728\n userName = weblitz_test_user\n sudoerId = \n sudoerName = \n groupId = 2718\n groupName = weblitz_test_user_group\n isAdmin = False\n adminPrivileges = \n {\n }\n eventId = -1\n eventType = Internal\n memberOfGroups = \n {\n [0] = 2718\n [1] = 1\n [2] = 2730\n [3] = 2719\n [4] = 2780\n }\n leaderOfGroups = \n {\n }\n groupPermissions = object #1 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -56\n }\n}.memberOfGroups + where object #0 (::omero::sys::EventContext)\n{\n shareId = -1\n sessionId = 10312\n sessionUuid = 9b756529-dc91-4678-b52c-11c942c2d2aa\n userId = 3728\n userName = weblitz_test_user\n sudoerId = \n sudoerName = \n groupId = 2718\n groupName = weblitz_test_user_group\n isAdmin = False\n adminPrivileges = \n {\n }\n eventId = -1\n eventType = Internal\n memberOfGroups = \n {\n [0] = 2718\n [1] = 1\n [2] = 2730\n [3] = 2719\n [4] = 2780\n }\n leaderOfGroups = \n {\n }\n groupPermissions = object #1 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -56\n }\n} = getEventContext() + where getEventContext = .getEventContext + where = .gateway FAILED test/integration/scriptstest/test_ping.py::TestPing::testProcessCallback - assert 0 > 0 + where 0 = len([]) + where [] = object #0 (::omero::grid::ProcessCallback)\n{\n}.finish FAILED test/integration/scriptstest/test_repo.py::TestScriptRepo::testGetGroupScripts - assert 6201 in [] FAILED test/integration/tablestest/test_service.py::TestTables::test2098 - AttributeError: type object 'path' has no attribute 'path' FAILED test/integration/tablestest/test_service.py::TestTables::testReadOnlyFile - AttributeError: 'TestTables' object has no attribute 'unique_dir' FAILED test/integration/tablestest/test_service.py::TestTables::testReadEqual - assert [1] == [1, 2, 3, 4, 5] Right contains 4 more items, first extra item: 2 Full diff: [ 1, - 2, - 3, - 4, - 5, ] FAILED test/integration/tablestest/test_service.py::TestTables::testReadOutOfRange - assert [4] == [-1, 0, 1, 2, 3, 4] At index 0 diff: 4 != -1 Right contains 5 more items, first extra item: 0 Full diff: [ - -1, - 0, - 1, - 2, - 3, 4, ] FAILED test/integration/test_admin.py::TestAdmin::testChangePasswordWhenUnset - omero.SecurityViolation: exception ::omero::SecurityViolation { serverStackTrace = ome.conditions.SecurityViolation: Old password is invalid at ome.logic.AdminImpl.changePasswordWithOldPassword(AdminImpl.java:1253) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at 
org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.changePasswordWithOldPassword(Unknown Source) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.changePasswordWithOldPassword(Unknown Source) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at 
ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.AdminI.changePasswordWithOldPassword_async(AdminI.java:144) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy97.changePasswordWithOldPassword_async(Unknown Source) at omero.api._IAdminTie.changePasswordWithOldPassword_async(_IAdminTie.java:112) at omero.api._IAdminDisp.___changePasswordWithOldPassword(_IAdminDisp.java:1977) at omero.api._IAdminDisp.__dispatch(_IAdminDisp.java:2229) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.SecurityViolation message = Old password is invalid } FAILED test/integration/test_admin.py::TestAdmin::test9193 - omero.ValidationException: exception ::omero::ValidationException { serverStackTrace = ome.conditions.ValidationException: experimenter cannot be a member of only the 'user' group, a different default group is also required at ome.logic.AdminImpl.removeGroups(AdminImpl.java:813) at jdk.internal.reflect.GeneratedMethodAccessor627.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at 
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.removeGroups(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor627.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy96.removeGroups(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor3269.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.AdminI.removeGroups_async(AdminI.java:318) at jdk.internal.reflect.GeneratedMethodAccessor3268.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy97.removeGroups_async(Unknown Source) at omero.api._IAdminTie.removeGroups_async(_IAdminTie.java:312) at omero.api._IAdminDisp.___removeGroups(_IAdminDisp.java:1621) at 
omero.api._IAdminDisp.__dispatch(_IAdminDisp.java:2377) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ValidationException message = experimenter cannot be a member of only the 'user' group, a different default group is also required } FAILED test/integration/test_files.py::TestFiles::testUploadDifferentSizeTicket2337 - omero.OptimisticLockException: exception ::omero::OptimisticLockException { serverStackTrace = ome.conditions.OptimisticLockException: You are not authorized to change the update event for ome.model.core.OriginalFile:Id_6206 from ome.model.meta.Event:Id_124677 to ome.model.meta.Event:Id_124676 You may need to reload the object before continuing. at ome.security.basic.OmeroInterceptor.managedEvent(OmeroInterceptor.java:1201) at ome.security.basic.OmeroInterceptor.checkManagedDetails(OmeroInterceptor.java:963) at ome.security.basic.OmeroInterceptor.resetDetails(OmeroInterceptor.java:465) at ome.security.basic.OmeroInterceptor.onFlushDirty(OmeroInterceptor.java:239) at org.hibernate.event.def.DefaultFlushEntityEventListener.invokeInterceptor(DefaultFlushEntityEventListener.java:372) at org.hibernate.event.def.DefaultFlushEntityEventListener.handleInterception(DefaultFlushEntityEventListener.java:349) at org.hibernate.event.def.DefaultFlushEntityEventListener.scheduleUpdate(DefaultFlushEntityEventListener.java:287) at org.hibernate.event.def.DefaultFlushEntityEventListener.onFlushEntity(DefaultFlushEntityEventListener.java:155) at org.hibernate.event.def.AbstractFlushingEventListener.flushEntities(AbstractFlushingEventListener.java:219) at org.hibernate.event.def.AbstractFlushingEventListener.flushEverythingToExecutions(AbstractFlushingEventListener.java:99) at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:50) at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1216) at ome.logic.UpdateImpl.afterUpdate(UpdateImpl.java:342) at ome.logic.UpdateImpl.doAction(UpdateImpl.java:358) at ome.logic.UpdateImpl.doAction(UpdateImpl.java:349) at ome.logic.UpdateImpl.saveAndReturnObject(UpdateImpl.java:135) at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at 
org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor683.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.UpdateI.saveAndReturnObject_async(UpdateI.java:62) at jdk.internal.reflect.GeneratedMethodAccessor682.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at 
org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy102.saveAndReturnObject_async(Unknown Source) at omero.api._IUpdateTie.saveAndReturnObject_async(_IUpdateTie.java:92) at omero.api._IUpdateDisp.___saveAndReturnObject(_IUpdateDisp.java:229) at omero.api._IUpdateDisp.__dispatch(_IUpdateDisp.java:423) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.OptimisticLockException message = You are not authorized to change the update event for ome.model.core.OriginalFile:Id_6206 from ome.model.meta.Event:Id_124677 to ome.model.meta.Event:Id_124676 You may need to reload the object before continuing. backOff = 0 } FAILED test/integration/test_ishare.py::TestIShare::test1172 - assert 0 == 1 + where 0 = len([]) FAILED test/integration/test_itimeline.py::TestITimeline::test1225 - assert 6 == 10 + where 6 = len({1908, 1944, 1995, 9450, 13072, 41480}) FAILED test/integration/test_permissions.py::TestPermissions::test3136 - AssertionError: elapsed1=0.03419137001037598, elapsed2=0.01781940460205078 assert 0.03419137001037598 < (0.1 * 0.01781940460205078) FAILED test/integration/test_permissions.py::TestPermissions::testSaveWithNegOneExplicit - omero.ApiUsageException: exception ::omero::ApiUsageException { serverStackTrace = ome.conditions.ApiUsageException: No valid permissions available! DUMMY permissions are not intended for copying. 
Make sure that you have not passed omero.group=-1 for a save without context at ome.model.internal.Permissions.(Permissions.java:164) at ome.security.basic.CurrentDetails.createDetails(CurrentDetails.java:439) at ome.security.basic.OmeroInterceptor.newTransientDetails(OmeroInterceptor.java:700) at ome.security.basic.OmeroInterceptor.onSave(OmeroInterceptor.java:187) at org.hibernate.event.def.AbstractSaveEventListener.substituteValuesIfNecessary(AbstractSaveEventListener.java:413) at org.hibernate.event.def.AbstractSaveEventListener.performSaveOrReplicate(AbstractSaveEventListener.java:292) at org.hibernate.event.def.AbstractSaveEventListener.performSave(AbstractSaveEventListener.java:203) at org.hibernate.event.def.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:143) at org.hibernate.event.def.DefaultMergeEventListener.saveTransientEntity(DefaultMergeEventListener.java:415) at org.hibernate.event.def.DefaultMergeEventListener.mergeTransientEntity(DefaultMergeEventListener.java:341) at org.hibernate.event.def.DefaultMergeEventListener.entityIsTransient(DefaultMergeEventListener.java:303) at org.springframework.orm.hibernate3.support.IdTransferringMergeEventListener.entityIsTransient(IdTransferringMergeEventListener.java:62) at ome.security.basic.MergeEventListener.entityIsTransient(MergeEventListener.java:154) at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:258) at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:87) at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:84) at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:73) at org.hibernate.impl.SessionImpl.fireMerge(SessionImpl.java:867) at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:851) at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:855) at ome.logic.UpdateImpl.internalMerge(UpdateImpl.java:313) at ome.logic.UpdateImpl$2.run(UpdateImpl.java:138) at ome.logic.UpdateImpl$2.run(UpdateImpl.java:135) at ome.logic.UpdateImpl.doAction(UpdateImpl.java:357) at ome.logic.UpdateImpl.doAction(UpdateImpl.java:349) at ome.logic.UpdateImpl.saveAndReturnObject(UpdateImpl.java:135) at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor683.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.UpdateI.saveAndReturnObject_async(UpdateI.java:62) at jdk.internal.reflect.GeneratedMethodAccessor682.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy102.saveAndReturnObject_async(Unknown Source) at omero.api._IUpdateTie.saveAndReturnObject_async(_IUpdateTie.java:92) at omero.api._IUpdateDisp.___saveAndReturnObject(_IUpdateDisp.java:229) at omero.api._IUpdateDisp.__dispatch(_IUpdateDisp.java:423) at IceInternal.Incoming.invoke(Incoming.java:221) at 
Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ApiUsageException message = No valid permissions available! DUMMY permissions are not intended for copying. Make sure that you have not passed omero.group=-1 for a save without context } FAILED test/integration/test_permissions.py::TestPermissions::testSaveWithNegBadLink - omero.ApiUsageException: exception ::omero::ApiUsageException { serverStackTrace = ome.conditions.ApiUsageException: No valid permissions available! DUMMY permissions are not intended for copying. Make sure that you have not passed omero.group=-1 for a save without context at ome.model.internal.Permissions.(Permissions.java:164) at ome.security.basic.CurrentDetails.createDetails(CurrentDetails.java:439) at ome.security.basic.OmeroInterceptor.newTransientDetails(OmeroInterceptor.java:700) at ome.security.basic.OmeroInterceptor.onSave(OmeroInterceptor.java:187) at org.hibernate.event.def.AbstractSaveEventListener.substituteValuesIfNecessary(AbstractSaveEventListener.java:413) at org.hibernate.event.def.AbstractSaveEventListener.performSaveOrReplicate(AbstractSaveEventListener.java:292) at org.hibernate.event.def.AbstractSaveEventListener.performSave(AbstractSaveEventListener.java:203) at org.hibernate.event.def.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:143) at org.hibernate.event.def.DefaultMergeEventListener.saveTransientEntity(DefaultMergeEventListener.java:415) at org.hibernate.event.def.DefaultMergeEventListener.mergeTransientEntity(DefaultMergeEventListener.java:341) at org.hibernate.event.def.DefaultMergeEventListener.entityIsTransient(DefaultMergeEventListener.java:303) at org.springframework.orm.hibernate3.support.IdTransferringMergeEventListener.entityIsTransient(IdTransferringMergeEventListener.java:62) at ome.security.basic.MergeEventListener.entityIsTransient(MergeEventListener.java:154) at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:258) at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:87) at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:84) at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:73) at org.hibernate.impl.SessionImpl.fireMerge(SessionImpl.java:867) at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:851) at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:855) at ome.logic.UpdateImpl.internalMerge(UpdateImpl.java:313) at ome.logic.UpdateImpl$2.run(UpdateImpl.java:138) at ome.logic.UpdateImpl$2.run(UpdateImpl.java:135) at ome.logic.UpdateImpl.doAction(UpdateImpl.java:357) at ome.logic.UpdateImpl.doAction(UpdateImpl.java:349) at ome.logic.UpdateImpl.saveAndReturnObject(UpdateImpl.java:135) at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at 
org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor683.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.UpdateI.saveAndReturnObject_async(UpdateI.java:62) at jdk.internal.reflect.GeneratedMethodAccessor682.invoke(Unknown Source) at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy102.saveAndReturnObject_async(Unknown Source) at omero.api._IUpdateTie.saveAndReturnObject_async(_IUpdateTie.java:92) at omero.api._IUpdateDisp.___saveAndReturnObject(_IUpdateDisp.java:229) at omero.api._IUpdateDisp.__dispatch(_IUpdateDisp.java:423) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ApiUsageException message = No valid permissions available! DUMMY permissions are not intended for copying. Make sure that you have not passed omero.group=-1 for a save without context } FAILED test/integration/test_permissions.py::TestPermissions::testSaveBadLink - omero.SecurityViolation: exception ::omero::SecurityViolation { serverStackTrace = ome.conditions.SecurityViolation: You are not authorized to set the ExperimenterGroup for ome.model.annotations.TagAnnotation:Id_41635 to ome.model.meta.ExperimenterGroup:Id_3678 at ome.security.basic.OmeroInterceptor.newTransientDetails(OmeroInterceptor.java:785) at ome.security.basic.OmeroInterceptor.onSave(OmeroInterceptor.java:187) at org.hibernate.event.def.AbstractSaveEventListener.substituteValuesIfNecessary(AbstractSaveEventListener.java:413) at org.hibernate.event.def.AbstractSaveEventListener.performSaveOrReplicate(AbstractSaveEventListener.java:292) at org.hibernate.event.def.AbstractSaveEventListener.performSave(AbstractSaveEventListener.java:203) at org.hibernate.event.def.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:143) at org.hibernate.event.def.DefaultMergeEventListener.saveTransientEntity(DefaultMergeEventListener.java:415) at org.hibernate.event.def.DefaultMergeEventListener.mergeTransientEntity(DefaultMergeEventListener.java:341) at org.hibernate.event.def.DefaultMergeEventListener.entityIsTransient(DefaultMergeEventListener.java:303) at org.springframework.orm.hibernate3.support.IdTransferringMergeEventListener.entityIsTransient(IdTransferringMergeEventListener.java:62) at ome.security.basic.MergeEventListener.entityIsTransient(MergeEventListener.java:154) at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:258) at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:87) at org.hibernate.impl.SessionImpl.fireMerge(SessionImpl.java:877) at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:859) at org.hibernate.engine.CascadingAction$6.cascade(CascadingAction.java:279) at 
org.hibernate.engine.Cascade.cascadeToOne(Cascade.java:392) at org.hibernate.engine.Cascade.cascadeAssociation(Cascade.java:335) at org.hibernate.engine.Cascade.cascadeProperty(Cascade.java:204) at org.hibernate.engine.Cascade.cascade(Cascade.java:161) at org.hibernate.event.def.AbstractSaveEventListener.cascadeBeforeSave(AbstractSaveEventListener.java:450) at org.hibernate.event.def.DefaultMergeEventListener.mergeTransientEntity(DefaultMergeEventListener.java:336) at org.hibernate.event.def.DefaultMergeEventListener.entityIsTransient(DefaultMergeEventListener.java:303) at org.springframework.orm.hibernate3.support.IdTransferringMergeEventListener.entityIsTransient(IdTransferringMergeEventListener.java:62) at ome.security.basic.MergeEventListener.entityIsTransient(MergeEventListener.java:154) at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:258) at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:87) at org.hibernate.impl.SessionImpl.fireMerge(SessionImpl.java:877) at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:859) at org.hibernate.engine.CascadingAction$6.cascade(CascadingAction.java:279) at org.hibernate.engine.Cascade.cascadeToOne(Cascade.java:392) at org.hibernate.engine.Cascade.cascadeAssociation(Cascade.java:335) at org.hibernate.engine.Cascade.cascadeProperty(Cascade.java:204) at org.hibernate.engine.Cascade.cascadeCollectionElements(Cascade.java:425) at org.hibernate.engine.Cascade.cascadeCollection(Cascade.java:362) at org.hibernate.engine.Cascade.cascadeAssociation(Cascade.java:338) at org.hibernate.engine.Cascade.cascadeProperty(Cascade.java:204) at org.hibernate.engine.Cascade.cascade(Cascade.java:161) at org.hibernate.event.def.AbstractSaveEventListener.cascadeAfterSave(AbstractSaveEventListener.java:475) at org.hibernate.event.def.DefaultMergeEventListener.mergeTransientEntity(DefaultMergeEventListener.java:388) at org.hibernate.event.def.DefaultMergeEventListener.entityIsTransient(DefaultMergeEventListener.java:303) at org.springframework.orm.hibernate3.support.IdTransferringMergeEventListener.entityIsTransient(IdTransferringMergeEventListener.java:62) at ome.security.basic.MergeEventListener.entityIsTransient(MergeEventListener.java:154) at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:258) at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:87) at org.hibernate.event.def.DefaultMergeEventListener.onMerge(DefaultMergeEventListener.java:84) at ome.security.basic.MergeEventListener.onMerge(MergeEventListener.java:73) at org.hibernate.impl.SessionImpl.fireMerge(SessionImpl.java:867) at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:851) at org.hibernate.impl.SessionImpl.merge(SessionImpl.java:855) at ome.logic.UpdateImpl.internalMerge(UpdateImpl.java:313) at ome.logic.UpdateImpl$2.run(UpdateImpl.java:138) at ome.logic.UpdateImpl$2.run(UpdateImpl.java:135) at ome.logic.UpdateImpl.doAction(UpdateImpl.java:357) at ome.logic.UpdateImpl.doAction(UpdateImpl.java:349) at ome.logic.UpdateImpl.saveAndReturnObject(UpdateImpl.java:135) at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at 
org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor609.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy101.saveAndReturnObject(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor683.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.UpdateI.saveAndReturnObject_async(UpdateI.java:62) at jdk.internal.reflect.GeneratedMethodAccessor682.invoke(Unknown Source) at 
java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy102.saveAndReturnObject_async(Unknown Source) at omero.api._IUpdateTie.saveAndReturnObject_async(_IUpdateTie.java:92) at omero.api._IUpdateDisp.___saveAndReturnObject(_IUpdateDisp.java:229) at omero.api._IUpdateDisp.__dispatch(_IUpdateDisp.java:423) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.SecurityViolation message = You are not authorized to set the ExperimenterGroup for ome.model.annotations.TagAnnotation:Id_41635 to ome.model.meta.ExperimenterGroup:Id_3678 } FAILED test/integration/test_permissions.py::TestPermissions::testUseOfRawFileBeanScriptReadCorrectGroupAndUser - Ice.UnknownException: exception ::Ice::UnknownException { unknown = ome.conditions.SecurityViolation: User 4735 is not an admin and so cannot set uid to 0 at ome.security.basic.BasicEventContext.checkAndInitialize(BasicEventContext.java:141) at ome.security.basic.CurrentDetails.checkAndInitialize(CurrentDetails.java:317) at ome.security.basic.BasicSecuritySystem.loadEventContext(BasicSecuritySystem.java:449) at ome.security.basic.EventHandler.doLogin(EventHandler.java:210) at ome.security.basic.EventHandler.invoke(EventHandler.java:146) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.orm.hibernate3.HibernateInterceptor.invoke(HibernateInterceptor.java:119) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at 
org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy74.doWork(Unknown Source) at ome.services.util.Executor$Impl.execute(Executor.java:447) at ome.services.blitz.repo.RepositoryDaoImpl.getFile(RepositoryDaoImpl.java:866) at ome.services.blitz.repo.PublicRepositoryI.checkId(PublicRepositoryI.java:823) at ome.services.blitz.repo.PublicRepositoryI.fileById(PublicRepositoryI.java:367) at omero.grid._RepositoryTie.fileById(_RepositoryTie.java:78) at omero.grid._RepositoryDisp.___fileById(_RepositoryDisp.java:393) at omero.grid._RepositoryDisp.__dispatch(_RepositoryDisp.java:538) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) } FAILED test/integration/test_rawfilestore.py::TestRFS::testTicket1961Basic - AssertionError: assert '' != '' + where '' = object #0 (::omero::RString)\n{\n _val = \n}.val + where object #0 (::omero::RString)\n{\n _val = \n} = object #0 (::omero::model::OriginalFile)\n{\n _id = object #1 (::omero::RLong)\n {\n _val = 6207\n }\n _details = object #2 (::omero::model::Details)\n {\n _owner = object #3 (::omero::model::Experimenter)\n {\n _id = object #4 (::omero::RLong)\n {\n _val = 4736\n }\n _details = \n _loaded = False\n _version = \n _groupExperimenterMapSeq = \n {\n }\n _groupExperimenterMapLoaded = False\n _omeName = \n _firstName = \n _middleName = \n _lastName = \n _institution = \n _ldap = \n _email = \n _config = \n {\n }\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n }\n _group = object #5 (::omero::model::ExperimenterGroup)\n {\n _id = object #6 (::omero::RLong)\n {\n _val = 3680\n }\n _details = object #7 (::... 
}\n groupPermissions = object #20 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -120\n }\n }\n }\n _loaded = True\n _version = \n _pixelsFileMapsSeq = \n {\n }\n _pixelsFileMapsLoaded = False\n _pixelsFileMapsCountPerOwner = \n {\n }\n _path = object #21 (::omero::RString)\n {\n _val = /tmp/test\n }\n _repo = \n _size = object #22 (::omero::RLong)\n {\n _val = 4\n }\n _atime = \n _mtime = object #23 (::omero::RTime)\n {\n _val = 1729754059632\n }\n _ctime = \n _hasher = \n _hash = object #24 (::omero::RString)\n {\n _val = \n }\n _mimetype = object #25 (::omero::RString)\n {\n _val = application/octet-stream\n }\n _filesetEntriesSeq = \n {\n }\n _filesetEntriesLoaded = True\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n _name = object #26 (::omero::RString)\n {\n _val = test\n }\n}.hash FAILED test/integration/test_rawfilestore.py::TestRFS::testTicket1961WithKillSession - AssertionError: assert '' != '' + where '' = object #0 (::omero::RString)\n{\n _val = \n}.val + where object #0 (::omero::RString)\n{\n _val = \n} = object #0 (::omero::model::OriginalFile)\n{\n _id = object #1 (::omero::RLong)\n {\n _val = 6208\n }\n _details = object #2 (::omero::model::Details)\n {\n _owner = object #3 (::omero::model::Experimenter)\n {\n _id = object #4 (::omero::RLong)\n {\n _val = 4736\n }\n _details = \n _loaded = False\n _version = \n _groupExperimenterMapSeq = \n {\n }\n _groupExperimenterMapLoaded = False\n _omeName = \n _firstName = \n _middleName = \n _lastName = \n _institution = \n _ldap = \n _email = \n _config = \n {\n }\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n }\n _group = object #5 (::omero::model::ExperimenterGroup)\n {\n _id = object #6 (::omero::RLong)\n {\n _val = 3680\n }\n _details = object #7 (::... 
}\n groupPermissions = object #20 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -120\n }\n }\n }\n _loaded = True\n _version = \n _pixelsFileMapsSeq = \n {\n }\n _pixelsFileMapsLoaded = False\n _pixelsFileMapsCountPerOwner = \n {\n }\n _path = object #21 (::omero::RString)\n {\n _val = /tmp/test\n }\n _repo = \n _size = object #22 (::omero::RLong)\n {\n _val = 4\n }\n _atime = \n _mtime = object #23 (::omero::RTime)\n {\n _val = 1729754059714\n }\n _ctime = \n _hasher = \n _hash = object #24 (::omero::RString)\n {\n _val = \n }\n _mimetype = object #25 (::omero::RString)\n {\n _val = application/octet-stream\n }\n _filesetEntriesSeq = \n {\n }\n _filesetEntriesLoaded = True\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n _name = object #26 (::omero::RString)\n {\n _val = test\n }\n}.hash FAILED test/integration/test_rawfilestore.py::TestRFS::testTicket2161Save - AssertionError: assert '' != '' + where '' = object #0 (::omero::RString)\n{\n _val = \n}.val + where object #0 (::omero::RString)\n{\n _val = \n} = object #0 (::omero::model::OriginalFile)\n{\n _id = object #1 (::omero::RLong)\n {\n _val = 6209\n }\n _details = object #2 (::omero::model::Details)\n {\n _owner = object #3 (::omero::model::Experimenter)\n {\n _id = object #4 (::omero::RLong)\n {\n _val = 4736\n }\n _details = \n _loaded = False\n _version = \n _groupExperimenterMapSeq = \n {\n }\n _groupExperimenterMapLoaded = False\n _omeName = \n _firstName = \n _middleName = \n _lastName = \n _institution = \n _ldap = \n _email = \n _config = \n {\n }\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n }\n _group = object #5 (::omero::model::ExperimenterGroup)\n {\n _id = object #6 (::omero::RLong)\n {\n _val = 3680\n }\n _details = object #7 (::... 
}\n groupPermissions = object #20 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -120\n }\n }\n }\n _loaded = True\n _version = \n _pixelsFileMapsSeq = \n {\n }\n _pixelsFileMapsLoaded = False\n _pixelsFileMapsCountPerOwner = \n {\n }\n _path = object #21 (::omero::RString)\n {\n _val = /tmp/test\n }\n _repo = \n _size = object #22 (::omero::RLong)\n {\n _val = 4\n }\n _atime = \n _mtime = object #23 (::omero::RTime)\n {\n _val = 1729754059760\n }\n _ctime = \n _hasher = \n _hash = object #24 (::omero::RString)\n {\n _val = \n }\n _mimetype = object #25 (::omero::RString)\n {\n _val = application/octet-stream\n }\n _filesetEntriesSeq = \n {\n }\n _filesetEntriesLoaded = True\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n _name = object #26 (::omero::RString)\n {\n _val = test\n }\n}.hash FAILED test/integration/test_rawfilestore.py::TestRFS::testNoWrite - AssertionError: assert '' != '' + where '' = object #0 (::omero::RString)\n{\n _val = \n}.val + where object #0 (::omero::RString)\n{\n _val = \n} = object #0 (::omero::model::OriginalFile)\n{\n _id = object #1 (::omero::RLong)\n {\n _val = 6210\n }\n _details = object #2 (::omero::model::Details)\n {\n _owner = object #3 (::omero::model::Experimenter)\n {\n _id = object #4 (::omero::RLong)\n {\n _val = 4737\n }\n _details = \n _loaded = False\n _version = \n _groupExperimenterMapSeq = \n {\n }\n _groupExperimenterMapLoaded = False\n _omeName = \n _firstName = \n _middleName = \n _lastName = \n _institution = \n _ldap = \n _email = \n _config = \n {\n }\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n }\n _group = object #5 (::omero::model::ExperimenterGroup)\n {\n _id = object #6 (::omero::RLong)\n {\n _val = 3681\n }\n _details = object #7 (::... }\n groupPermissions = object #20 (::omero::model::Permissions)\n {\n _restrictions = \n {\n }\n _extendedRestrictions = \n {\n }\n _perm1 = -56\n }\n }\n }\n _loaded = True\n _version = \n _pixelsFileMapsSeq = \n {\n }\n _pixelsFileMapsLoaded = False\n _pixelsFileMapsCountPerOwner = \n {\n }\n _path = object #21 (::omero::RString)\n {\n _val = /tmp/test\n }\n _repo = \n _size = object #22 (::omero::RLong)\n {\n _val = 4\n }\n _atime = \n _mtime = object #23 (::omero::RTime)\n {\n _val = 1729754065593\n }\n _ctime = \n _hasher = \n _hash = object #24 (::omero::RString)\n {\n _val = \n }\n _mimetype = object #25 (::omero::RString)\n {\n _val = application/octet-stream\n }\n _filesetEntriesSeq = \n {\n }\n _filesetEntriesLoaded = True\n _annotationLinksSeq = \n {\n }\n _annotationLinksLoaded = False\n _annotationLinksCountPerOwner = \n {\n }\n _name = object #26 (::omero::RString)\n {\n _val = test\n }\n}.hash FAILED test/integration/test_reporawfilestore.py::TestRepoRawFileStore::testFailedWriteNoFile - Failed: DID NOT RAISE FAILED test/integration/test_scripts.py::TestScripts::testAutoFillTicket2326 - omero.NoProcessorAvailable: exception ::omero::NoProcessorAvailable { serverStackTrace = serverExceptionClass = message = No processor available! 
[0 response(s)] processorCount = 0 } FAILED test/integration/test_scripts.py::TestScripts::testParamLoadingPerformanceTicket2285 - omero.ValidationException: exception ::omero::ValidationException { serverStackTrace = serverExceptionClass = message = Invalid parameters: WRONG TYPE for "a": != } FAILED test/integration/test_scripts.py::TestScripts::test3527 - omero.NoProcessorAvailable: exception ::omero::NoProcessorAvailable { serverStackTrace = serverExceptionClass = message = No processor available! [0 response(s)] processorCount = 0 } FAILED test/integration/test_thumbnailPerms.py::TestThumbnailPerms::testPrivate10618RootWithNoCtx - omero.ResourceError: exception ::omero::ResourceError { serverStackTrace = ome.conditions.ResourceError: Error retrieving Pixels id:2901. Pixels set does not exist or the user id:0 has insufficient permissions to retrieve it. at ome.services.ThumbnailCtx.isExtendedGraphCritical(ThumbnailCtx.java:749) at ome.services.ThumbnailCtx.createAndPrepareMissingRenderingSettings(ThumbnailCtx.java:388) at ome.services.ThumbnailBean.getThumbnailByLongestSideSet(ThumbnailBean.java:1003) at jdk.internal.reflect.GeneratedMethodAccessor3879.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.EventHandler.invoke(EventHandler.java:154) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.SessionHandler.doStateful(SessionHandler.java:216) at ome.tools.hibernate.SessionHandler.invoke(SessionHandler.java:200) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99) at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.tools.hibernate.ProxyCleanupFilter$Interceptor.invoke(ProxyCleanupFilter.java:249) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.util.ServiceHandler.invoke(ServiceHandler.java:121) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy113.getThumbnailByLongestSideSet(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor3879.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at 
org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at ome.security.basic.BasicSecurityWiring.invoke(BasicSecurityWiring.java:93) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at ome.services.blitz.fire.AopContextInitializer.invoke(AopContextInitializer.java:43) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy113.getThumbnailByLongestSideSet(Unknown Source) at jdk.internal.reflect.GeneratedMethodAccessor3884.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at ome.services.blitz.util.IceMethodInvoker.invoke(IceMethodInvoker.java:172) at ome.services.throttling.Callback.run(Callback.java:56) at ome.services.throttling.InThreadThrottlingStrategy.callInvokerOnRawArgs(InThreadThrottlingStrategy.java:56) at ome.services.blitz.impl.AbstractAmdServant.callInvokerOnRawArgs(AbstractAmdServant.java:140) at ome.services.blitz.impl.ThumbnailStoreI.getThumbnailByLongestSideSet_async(ThumbnailStoreI.java:83) at jdk.internal.reflect.GeneratedMethodAccessor3883.invoke(Unknown Source) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157) at omero.cmd.CallContext.invoke(CallContext.java:85) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) at com.sun.proxy.$Proxy114.getThumbnailByLongestSideSet_async(Unknown Source) at omero.api._ThumbnailStoreTie.getThumbnailByLongestSideSet_async(_ThumbnailStoreTie.java:132) at omero.api._ThumbnailStoreDisp.___getThumbnailByLongestSideSet(_ThumbnailStoreDisp.java:743) at omero.api._ThumbnailStoreDisp.__dispatch(_ThumbnailStoreDisp.java:1088) at IceInternal.Incoming.invoke(Incoming.java:221) at Ice.ConnectionI.invokeAll(ConnectionI.java:2536) at Ice.ConnectionI.dispatch(ConnectionI.java:1145) at Ice.ConnectionI.message(ConnectionI.java:1056) at IceInternal.ThreadPool.run(ThreadPool.java:395) at IceInternal.ThreadPool.access$300(ThreadPool.java:12) at IceInternal.ThreadPool$EventHandlerThread.run(ThreadPool.java:832) at java.base/java.lang.Thread.run(Thread.java:829) serverExceptionClass = ome.conditions.ResourceError message = Error retrieving Pixels id:2901. Pixels set does not exist or the user id:0 has insufficient permissions to retrieve it. } FAILED test/integration/test_tickets2000.py::TestTickets2000::test1184 - AssertionError: Expected the test to complete in < 3 seconds, took: 11.561142 assert 11.561142206192017 < 3.0 ==== 33 failed, 3 passed, 2025 deselected, 12 warnings in 236.22s (0:03:56) ==== !! 
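Most of the pytest failures above share one pattern: the OriginalFile that the test_rawfilestore.py tests end up with has an empty checksum (ofile.hash.val == ''), which is why the reported assertion reduces to assert '' != ''. The following is a minimal reproduction sketch against the OMERO Python API, not code from the test suite; HOST, USER, PASSWORD and FILE_ID are placeholders rather than values from this build, and an existing, writable OriginalFile is assumed.

    import omero

    FILE_ID = 1  # placeholder: id of an existing, writable OriginalFile

    client = omero.client("HOST")
    try:
        session = client.createSession("USER", "PASSWORD")
        rfs = session.createRawFileStore()
        rfs.setFileId(FILE_ID)
        data = b"abcd"                  # the failing tests wrote 4 bytes (_size = 4)
        rfs.write(data, 0, len(data))
        ofile = rfs.save()              # returns the updated omero.model.OriginalFile
        # The failing assertions expect a non-empty checksum after save():
        assert ofile.hash.val != ""
    finally:
        client.closeSession()

Two of the test_scripts.py failures report omero.NoProcessorAvailable with processorCount = 0, which typically means no OMERO.scripts processor was registered with the grid when the script was launched (the third, testParamLoadingPerformanceTicket2285, is a parameter-validation error). A hypothetical guard along these lines, where client, script_id and inputs are placeholders and not part of the suite, would report the missing-processor case as a skip rather than a hard failure:

    import pytest
    import omero

    def run_script_or_skip(client, script_id, inputs):
        # Sketch only: turn "no processor registered" into a pytest skip.
        svc = client.sf.getScriptService()
        try:
            return svc.runScript(script_id, inputs, None)
        except omero.NoProcessorAvailable:
            pytest.skip("no OMERO.scripts processor is registered with the grid")

The thumbnailPerms failure is a third kind: root (user id 0) is refused Pixels 2901, and the test name (testPrivate10618RootWithNoCtx) suggests the thumbnail call was issued without an explicit group context. For reference only, a hedged sketch of the same getThumbnailByLongestSideSet call made with an explicit omero.group context, which is how clients normally scope access to data in another user's group; session, GROUP_ID and PIXELS_ID are placeholders, and whether the test intends the no-context call to succeed is not shown in this log.

    from omero.rtypes import rint

    GROUP_ID = 1    # placeholder: group owning the pixels, or "-1" for all groups
    PIXELS_ID = 1   # placeholder: pixels id to thumbnail

    tb = session.createThumbnailStore()
    ctx = {"omero.group": str(GROUP_ID)}
    thumbs = tb.getThumbnailByLongestSideSet(rint(64), [PIXELS_ID], ctx)

The remaining test_tickets2000.py failure is purely a timing assertion (11.56 s against a 3 s budget).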
10/24/24 08:16:03.777 error: 4 communicators not destroyed during global destruction. Result: 1 BUILD SUCCESSFUL Total time: 4 minutes 2 seconds + /home/omero/workspace/OMERO-test-integration/src/build.py -f components/tools/OmeroJava/build.xml -Dtestng.useDefaultListeners=true -Dtestreports.dir=target/reports/broken broken OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0 Buildfile: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/build.xml Entering /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava... testng-init: :: Apache Ivy 2.4.0 - 20141213170938 :: http://ant.apache.org/ivy/ :: :: loading settings :: file = /home/omero/workspace/OMERO-test-integration/src/etc/ivysettings.xml 07:16:37,446 |-INFO in ch.qos.logback.classic.LoggerContext[default] - This is logback-classic version 1.3.14 07:16:37,449 |-INFO in ch.qos.logback.classic.util.ContextInitializer@2416c658 - No custom configurators were discovered as a service. 07:16:37,449 |-INFO in ch.qos.logback.classic.util.ContextInitializer@2416c658 - Trying to configure with ch.qos.logback.classic.joran.SerializedModelConfigurator 07:16:37,453 |-INFO in ch.qos.logback.classic.util.ContextInitializer@2416c658 - Constructed configurator of type class ch.qos.logback.classic.joran.SerializedModelConfigurator 07:16:37,453 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.scmo] 07:16:37,454 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.scmo] 07:16:37,455 |-INFO in ch.qos.logback.classic.util.ContextInitializer@2416c658 - ch.qos.logback.classic.joran.SerializedModelConfigurator.configure() call lasted 2 milliseconds. ExecutionStatus=INVOKE_NEXT_IF_ANY 07:16:37,455 |-INFO in ch.qos.logback.classic.util.ContextInitializer@2416c658 - Trying to configure with ch.qos.logback.classic.util.DefaultJoranConfigurator 07:16:37,457 |-INFO in ch.qos.logback.classic.util.ContextInitializer@2416c658 - Constructed configurator of type class ch.qos.logback.classic.util.DefaultJoranConfigurator 07:16:37,457 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.xml] 07:16:37,458 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Found resource [logback.xml] at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes/logback.xml] 07:16:37,459 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@6d5f4900 - Resource [logback.xml] occurs multiple times on the classpath. 
07:16:37,459 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@6d5f4900 - Resource [logback.xml] occurs at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes/logback.xml] 07:16:37,459 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@6d5f4900 - Resource [logback.xml] occurs at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/resources/logback.xml] 07:16:37,801 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [stderr] 07:16:37,801 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 07:16:37,820 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property 07:16:37,822 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@1e40fbb3 - As of version 1.2.0 "immediateFlush" property should be set within the enclosing Appender. 07:16:37,822 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@1e40fbb3 - Please move "immediateFlush" property into the enclosing appender. 07:16:37,886 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@1e40fbb3 - Setting the "immediateFlush" property of the enclosing appender to true 07:16:37,887 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [velocity] to ERROR 07:16:37,887 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org] to ERROR 07:16:37,887 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [ome] to ERROR 07:16:37,887 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [loci] to ERROR 07:16:37,887 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to WARN 07:16:37,888 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [stderr] to Logger[ROOT] 07:16:37,889 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@1b560eb0 - End of configuration. 07:16:37,891 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator@9e02f84 - Registering current configuration as safe fallback point 07:16:37,891 |-INFO in ch.qos.logback.classic.util.ContextInitializer@2416c658 - ch.qos.logback.classic.util.DefaultJoranConfigurator.configure() call lasted 434 milliseconds. 
ExecutionStatus=DO_NOT_INVOKE_NEXT_IF_ANY lifecycle.test-compile: Deleting: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/_omero_build_771177678.tmp Deleting: /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/OmeroJava-test.xml :: loading settings :: file = /home/omero/workspace/OMERO-test-integration/src/etc/ivysettings.xml :: delivering :: omero#OmeroJava-test;working@bdaf9f08f5d1 :: 5.6.3-513-75ed6e6d79-ice36-ice36 :: integration :: Thu Oct 24 07:16:04 UTC 2024 delivering ivy file to /home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/OmeroJava-test.xml :: publishing :: omero#OmeroJava-test published OmeroJava-test to /home/omero/workspace/OMERO-test-integration/src/target/test-repository/OmeroJava-test-5.6.3-513-75ed6e6d79-ice36-ice36.jar published ivy to /home/omero/workspace/OMERO-test-integration/src/target/test-repository/OmeroJava-test-5.6.3-513-75ed6e6d79-ice36-ice36.xml broken: OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0 07:16:38,781 |-INFO in ch.qos.logback.classic.LoggerContext[default] - This is logback-classic version 1.3.14 07:16:38,785 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - No custom configurators were discovered as a service. 07:16:38,785 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - Trying to configure with ch.qos.logback.classic.joran.SerializedModelConfigurator 07:16:38,786 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - Constructed configurator of type class ch.qos.logback.classic.joran.SerializedModelConfigurator 07:16:38,787 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.scmo] 07:16:38,788 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.scmo] 07:16:38,789 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - ch.qos.logback.classic.joran.SerializedModelConfigurator.configure() call lasted 3 milliseconds. ExecutionStatus=INVOKE_NEXT_IF_ANY 07:16:38,789 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - Trying to configure with ch.qos.logback.classic.util.DefaultJoranConfigurator 07:16:38,789 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - Constructed configurator of type class ch.qos.logback.classic.util.DefaultJoranConfigurator 07:16:38,790 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.xml] 07:16:38,791 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Found resource [logback.xml] at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes/logback.xml] 07:16:38,792 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@10e31a9a - Resource [logback.xml] occurs multiple times on the classpath. 
07:16:38,792 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@10e31a9a - Resource [logback.xml] occurs at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/classes/logback.xml] 07:16:38,792 |-WARN in ch.qos.logback.classic.util.DefaultJoranConfigurator@10e31a9a - Resource [logback.xml] occurs at [file:/home/omero/workspace/OMERO-test-integration/src/components/tools/OmeroJava/target/generated/resources/logback.xml] 07:16:39,716 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [stderr] 07:16:39,716 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 07:16:39,725 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property 07:16:39,726 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@131774fe - As of version 1.2.0 "immediateFlush" property should be set within the enclosing Appender. 07:16:39,726 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@131774fe - Please move "immediateFlush" property into the enclosing appender. 07:16:39,756 |-WARN in ch.qos.logback.classic.encoder.PatternLayoutEncoder@131774fe - Setting the "immediateFlush" property of the enclosing appender to true 07:16:39,757 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [velocity] to ERROR 07:16:39,757 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org] to ERROR 07:16:39,757 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [ome] to ERROR 07:16:39,757 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [loci] to ERROR 07:16:39,757 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to WARN 07:16:39,757 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [stderr] to Logger[ROOT] 07:16:39,757 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@158d2680 - End of configuration. 07:16:39,758 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator@77847718 - Registering current configuration as safe fallback point 07:16:39,758 |-INFO in ch.qos.logback.classic.util.ContextInitializer@3febb011 - ch.qos.logback.classic.util.DefaultJoranConfigurator.configure() call lasted 969 milliseconds. ExecutionStatus=DO_NOT_INVOKE_NEXT_IF_ANY Oct 24, 2024 7:16:45 AM ome.system.OmeroContext prepareRefresh INFO: Refreshing ome.system.OmeroContext@9fec931: startup date [Thu Oct 24 07:16:45 UTC 2024]; root of context hierarchy Oct 24, 2024 7:16:45 AM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions INFO: Loading XML bean definitions from class path resource [ome/config.xml] =============================================== OmeroJava.integration Total tests run: 43, Passes: 2, Failures: 41, Skips: 0 =============================================== The tests failed. 
BUILD SUCCESSFUL Total time: 2 minutes 38 seconds + deactivate + '[' -n /opt/ice-3.6.5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin ']' + PATH=/opt/ice-3.6.5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin + export PATH + unset _OLD_VIRTUAL_PATH + '[' -n '' ']' + '[' -n /bin/bash -o -n '' ']' + hash -r + '[' -n '' ']' + unset VIRTUAL_ENV + '[' '!' '' = nondestructive ']' + unset -f deactivate Recording test results [Checks API] No suitable checks publisher found. Build step 'Publish JUnit test result report' changed build result to UNSTABLE TestNG Reports Processing: START Looking for TestNG results report in workspace using pattern: **/OmeroJava/target/reports/integration/*.xml Saving reports... Processing '/var/jenkins_home/jobs/OMERO-test-integration/builds/206/testng/testng-results-1.xml' Processing '/var/jenkins_home/jobs/OMERO-test-integration/builds/206/testng/testng-results.xml' 0.038625% of tests failed, which exceeded threshold of 0%. Marking build as UNSTABLE TestNG Reports Processing: FINISH Finished: UNSTABLE