diff --git a/.gitlab-ci.yml b/.gitlab-ci.yml
index 5a915ca23472b47af4e961afe2eca414347dc325..d57173df152ed13a4dc4c16f6d133c034305f581 100644
--- a/.gitlab-ci.yml
+++ b/.gitlab-ci.yml
@@ -39,14 +39,6 @@ stages:
       reports:
         junit: "reports/junit*.xml"
 
-test-py3.6:
-    image: docker.km3net.de/base/python:3.6
-    stage: test
-    script:
-        - *virtualenv_definition
-        - make test
-    <<: *junit_definition
-
 test-py3.7:
     image: docker.km3net.de/base/python:3.7
     stage: test
@@ -71,6 +63,30 @@ test-py3.9:
         - make test
     <<: *junit_definition
 
+test-py3.10:
+    image: docker.km3net.de/base/python:3.10
+    stage: test
+    script:
+        - *virtualenv_definition
+        - make test
+    <<: *junit_definition
+
+test-py3.11:
+    image: docker.km3net.de/base/python:3.11
+    stage: test
+    script:
+        - *virtualenv_definition
+        - make test
+    <<: *junit_definition
+
+test-py3.12:
+    image: git.km3net.de:4567/common/dockerfiles/base/python:3.12
+    stage: test
+    script:
+        - *virtualenv_definition
+        - make test
+    <<: *junit_definition
+
 code-style:
     image: docker.km3net.de/base/python:3.9
     stage: test
diff --git a/CHANGELOG.rst b/CHANGELOG.rst
index 445dd2487b203eda91d692708ec1283fe9df9f10..1dcab37e0c572defbbf06ca398b2fb00b1058dad 100644
--- a/CHANGELOG.rst
+++ b/CHANGELOG.rst
@@ -4,6 +4,13 @@ Unreleased changes
 
 Version 1
 ---------
+1.2.0 / 2024-06-24
+~~~~~~~~~~~~~~~~~~
+* Removed online format support (online events, timeslices and summary slices) in favour of
+  the `KM3io.jl <https://git.km3net.de/common/KM3io.jl>`__ Julia package.
+* uproot 5 and awkward 2 are now required
+* Python 3.7+ required
+
 1.1.0 / 2024-03-14
 ~~~~~~~~~~~~~~~~~~
 * A few astro helpers were added: azimuth(), zenith(), phi(), theta(), ...
diff --git a/README.rst b/README.rst
index 2ba082d14fc7d93ce1d2366f18774c1f2060634a..5ec5fc868c77aa0b2dcb1e4908af59d77715aab7 100644
--- a/README.rst
+++ b/README.rst
@@ -58,11 +58,21 @@ If you have a question about km3io, please proceed as follows:
 Introduction
 ------------
 
-Most of km3net data is stored in root files. These root files are created using the `KM3NeT Dataformat library <https://git.km3net.de/common/km3net-dataformat>`__
-A ROOT file created with
-`Jpp <https://git.km3net.de/common/jpp>`__ is an "online" file and all other software usually produces "offline" files.
-
-km3io is a Python package that provides a set of classes: ``OnlineReader``, ``OfflineReader`` and a special class to read gSeaGen files. All of these ROOT files can be read installing any other software like Jpp, aanet or ROOT.
+Most KM3NeT data is stored in ROOT files. These ROOT files are created using
+the `KM3NeT Dataformat library
+<https://git.km3net.de/common/km3net-dataformat>`__. A ROOT file created with
+`Jpp <https://git.km3net.de/common/jpp>`__ is an "online" file and all other
+software usually produces "offline" files.
+
+km3io is a Python package that provides access to offline files via its
+``OfflineReader`` class and a dedicated class to read gSeaGen files. All of
+these ROOT files can be read without installing any other software like Jpp,
+aanet or ROOT. km3io v1.1 and earlier also supported access to online files
+(events, summaryslices and timeslices). This feature has been dropped due to a
+lack of maintenance power and in favour of the
+`KM3io.jl <https://git.km3net.de/common/KM3io.jl>`__ Julia package, which
+provides high-performance access to all ROOT files and should be prioritised
+over ``km3io`` when performance matters (which it does, most of the time).
 
 Data in km3io is returned as ``awkward.Array`` which is an advance Numpy-like container type to store
 contiguous data for high performance computations.
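For a concrete starting point, here is a minimal sketch of opening an offline file with ``OfflineReader`` (the ``km3net_testdata`` helper and the sample file name are assumptions used only for illustration):

.. code-block:: python3

  import km3io
  from km3net_testdata import data_path  # assumption: the test-data helper package is installed

  # Opening the file is lazy: only meta information is read at this point.
  f = km3io.OfflineReader(data_path("offline/km3net_offline.root"))
  print(f)  # prints a short summary of the file contents

Event data is then loaded on access and returned as ``awkward.Array`` instances, as described above.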
@@ -254,100 +264,4 @@ to retrieve the energy of the very first reconstructed track for the first three
 Online files reader
 -------------------
 
-``km3io`` is able to read events, summary slices and timeslices. Timeslices are
-currently only supported with split level of 2 or more, which means that reading
-L0 timeslices is not working at the moment (but is in progress).
-
-Let's have a look at some online data.
-
-Reading online events
-"""""""""""""""""""""
-
-Now we use the ``OnlineReader`` to create our file object.
-
-.. code-block:: python3
-
-  import km3io
-  f = km3io.OnlineReader(data_path("online/km3net_online.root"))
-
-
-That's it, we created an object which gives access to all the events, but the
-relevant data is still not loaded into the memory (lazy access)!
-The structure is different compared to the ``OfflineReader``
-because online files contain additional branches at the top level
-(summaryslices and timeslices).
-
-.. code-block:: python3
-
-  >>> f.events
-  Number of events: 3
-  >>> f.events.snapshot_hits[1].tot[:10]
-  array([27, 24, 21, 17, 22, 15, 24, 30, 19, 15], dtype=uint8)
-  >>> f.events.triggered_hits[1].channel_id[:10]
-  array([ 2,  3, 16, 22, 23,  0,  2,  3,  4,  5], dtype=uint8)
-
-The resulting arrays are numpy arrays. The indexing convention is: the first indexing
-corresponds to the event, the second to the branch and consecutive ones to the
-optional dimensions of the arrays. In the last step we accessed the PMT channel IDs
-of the first 10 hits of the second event.
-
-Reading SummarySlices
-"""""""""""""""""""""
-
-The following example shows how to access summary slices. The summary slices are
-returned in chunks to be more efficient with the I/O. The default chunk-size is
-1000. In the example file we only have three summaryslices, so there is only a single
-chunk. The first index passed to the summaryslices reader is corresponding to the
-chunk and the second to the index of the summaryslice in that chunk.
-
-.. code-block:: python3
-
-  >>> f.summaryslices
-  <SummarysliceReader 3 items, step_size=1000 (1 chunk)>
-  >>> f.summaryslices[0]
-  SummarysliceChunk(headers=<Array [{' cnt': 671088704, ... ] type='3 * {" cnt": uint32, " vers": uint16, " ...'>, slices=<Array [[{dom_id: 806451572, ... ch30: 48}]] type='3 * var * {"dom_id": int32, "...'>)
-  >>> f.summaryslices[0].headers
-  <Array [{' cnt': 671088704, ... ] type='3 * {" cnt": uint32, " vers": uint16, " ...'>
-  >>> f.summaryslices[0].slices[2]
-  <Array [{dom_id: 806451572, ... ch30: 48}] type='68 * {"dom_id": int32, "dq_stat...'>
-  >>> f.summaryslices[0].slices[2].dom_id
-  <Array [806451572, 806455814, ... 809544061] type='68 * int32'>
-  >>> f.summaryslices[0].slices[2].ch23
-  <Array [48, 43, 46, 54, 83, ... 51, 51, 52, 50] type='68 * uint8'>
-
-Reading Timeslices
-""""""""""""""""""
-
-Timeslices are split into different streams since 2017 and ``km3io`` currently
-supports everything except L0, i.e. L1, L2 and SN streams. The API is
-work-in-progress and will be improved in future, however, all the data is
-already accessible (although in ugly ways ;-)
-
-To access the timeslice data, you need to specify which timeslice stream
-to read:
-
-.. code-block:: python3
-
-  >>> f.timeslices
-  Available timeslice streams: SN, L1
-  >>> f.timeslices.stream("L1", 0).frames
-  {806451572: <Table [<Row 0> <Row 1> <Row 2> ... <Row 981> <Row 982> <Row 983>] at 0x00014c167340>,
-  806455814: <Table [<Row 984> <Row 985> <Row 986> ... <Row 1985> <Row 1986> <Row 1987>] at 0x00014c5f4760>,
-  806465101: <Table [<Row 1988> <Row 1989> <Row 1990> ... <Row 2236> <Row 2237> <Row 2238>] at 0x00014c5f45e0>,
-  806483369: <Table [<Row 2239> <Row 2240> <Row 2241> ... <Row 2965> <Row 2966> <Row 2967>] at 0x00014c12b910>,
-  ...
-  809544061: <Table [<Row 48517> <Row 48518> <Row 48519> ... <Row 49240> <Row 49241> <Row 49242>] at 0x00014ca57100>}
-
-The frames are represented by a dictionary where the key is the ``DOM ID`` and
-the value an awkward array of hits, with the usual fields to access the PMT
-channel, time and ToT:
-
-.. code-block:: python3
-
-   >>> f.timeslices.stream("L1", 0).frames[809524432].dtype
-   dtype([('pmt', 'u1'), ('tdc', '<u4'), ('tot', 'u1')])
-   >>> f.timeslices.stream("L1", 0).frames[809524432].tot
-  array([25, 27, 28, ..., 29, 22, 28], dtype=uint8)
-
-
-
+Support for reading online ROOT files has been dropped in ``km3io`` v1.2. Use
+the `KM3io.jl <https://git.km3net.de/common/KM3io.jl>`__ Julia package instead.
diff --git a/examples/README.rst b/examples/README.rst
index 31ecdcf181b5766c08e807ca97aaa3f43738ec78..4630ecf7150eae91316111ee4fde24741633155f 100644
--- a/examples/README.rst
+++ b/examples/README.rst
@@ -1,4 +1,4 @@
-Reading online and offline data
-===============================
+Reading offline data
+====================
 
 Feel free to explore and extend our examples!
diff --git a/examples/plot_online_example.py b/examples/plot_online_example.py
deleted file mode 100644
index ade4f4f60dd1cf4e644f09b61e4e8927d27eeab0..0000000000000000000000000000000000000000
--- a/examples/plot_online_example.py
+++ /dev/null
@@ -1,78 +0,0 @@
-"""
-Reading Online Data
-===================
-
-The following example shows how to access hits in a ROOT file which is coming
-from the detector and written by the `JDataWriter` application.
-
-Such a file is usually called "KM3NET_00000001_00000002.root", where the first
-number is the detector ID and the second the run number.
-"""
-
-import km3io as ki
-from km3net_testdata import data_path
-
-#####################################################
-# Accessing the event tree
-# ------------------------
-# Just pass a filename to the reader class and get access to the event tree
-# with:
-
-
-f = ki.OnlineReader(data_path("online/km3net_online.root"))
-
-#####################################################
-# Note that only some meta information is read into memory.
-#
-# Printing it will simply tell you how many events it has found. Again, nothing
-# else is read yet:
-
-print(f.events)
-
-#####################################################
-# Now let's look at the hits data:
-
-print(f.events[0].snapshot_hits.tot)
-
-#####################################################
-# the resulting arrays are numpy arrays.
-
-#####################################################
-# Reading SummarySlices
-# ---------------------
-# The following example shows how to access summary slices, in particular the DOM
-# IDs of the slice with the index 0.
-# The current implementation of the summaryslice I/O uses a chunked reading for
-# better performance, which means that when you iterate through the `.slices`,
-# you'll get chunks of summaryslices in each iteration instead of a single one.
-#
-# In the example below, we simulate a single iteration by using the `break`
-# keyword and then use the data which has been "pulled out" of the ROOT file.
-
-
-for chunk in f.summaryslices:
-    break
-
-#####################################################
-# `chunk` now contains the first set of summaryslices so `chunk.slice[0]` refers
-# to the first summaryslice in the ROOT file. To access e.g. the DOM IDs, use
-# the `.dom_id` attribute
-
-dom_ids = chunk.slices[0].dom_id
-
-print(dom_ids)
-
-#####################################################
-# The .type attribute (or in general, <TAB> completion) is useful to find out
-# more about the field structure:
-
-print(chunk.slices.type)
-
-#####################################################
-# Similar to the summaryslice data, the headers can be accessed the same way
-# To read the frame index of all summaryslices in the obtained chunk:
-
-print(chunk.headers.frame_index)
-
-#####################################################
-# To be continued...
diff --git a/setup.cfg b/setup.cfg
index bcdf6b023217db9422d63f81f108e71c050ec9e9..4d299de5f4165272cf7bdeba859db311b9f89277 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -19,7 +19,6 @@ classifiers =
     Programming Language :: Python
     Programming Language :: Python :: 3
     Programming Language :: Python :: 3 :: Only
-    Programming Language :: Python :: 3.6
     Programming Language :: Python :: 3.7
     Programming Language :: Python :: 3.8
     Programming Language :: Python :: 3.9
@@ -37,12 +36,10 @@ packages = find:
 install_requires =
     docopt
     numba>=0.50
-    awkward>=1.9
-    awkward0>=0.15.5
-    uproot3>=3.11.1
-    uproot>=4.2.2
+    awkward>=2
+    uproot>=5
     setuptools_scm
-python_requires = >=3.6
+python_requires = >=3.7
 include_package_data = True
 package_dir =
     =src
@@ -73,10 +70,6 @@ dev =
     sphinxcontrib-versioning
     wheel
 
-[options.entry_points]
-console_scripts =
-    KPrintTree = km3io.utils.kprinttree:main
-
 [options.package_data]
 * = *.mplstyle, *.py.typed
 
diff --git a/src/km3io/__init__.py b/src/km3io/__init__.py
index 87319be5305dadc56ea53cb79e7b9e3b99c948b8..4ea93667f383a98003df755d22a8df2da793ad6c 100644
--- a/src/km3io/__init__.py
+++ b/src/km3io/__init__.py
@@ -16,11 +16,5 @@ import os
 # This needs to be done before import numpy
 os.environ["KMP_WARNINGS"] = "off"
 
-with warnings.catch_warnings():
-    for warning_category in (FutureWarning, DeprecationWarning):
-        warnings.simplefilter("ignore", category=warning_category)
-    import uproot3
-
 from .offline import OfflineReader
-from .online import OnlineReader
 from .acoustics import RawAcousticReader
diff --git a/src/km3io/online.py b/src/km3io/online.py
deleted file mode 100644
index cc8439f8359233aff73053628ed9a12b3e6276ad..0000000000000000000000000000000000000000
--- a/src/km3io/online.py
+++ /dev/null
@@ -1,488 +0,0 @@
-import binascii
-from collections import namedtuple
-import os
-import uproot
-import uproot3
-import numpy as np
-
-import numba as nb
-
-TIMESLICE_FRAME_BASKET_CACHE_SIZE = 523 * 1024**2  # [byte]
-SUMMARYSLICE_FRAME_BASKET_CACHE_SIZE = 523 * 1024**2  # [byte]
-BASKET_CACHE_SIZE = 110 * 1024**2
-BASKET_CACHE = uproot3.cache.ThreadSafeArrayCache(BASKET_CACHE_SIZE)
-
-# Parameters for PMT rate conversions, since the rates in summary slices are
-# stored as a single byte to save space. The values from 0-255 can be decoded
-# using the `get_rate(value)` function, which will yield the actual rate
-# in Hz.
-MINIMAL_RATE_HZ = 2.0e3
-MAXIMAL_RATE_HZ = 2.0e6
-RATE_FACTOR = np.log(MAXIMAL_RATE_HZ / MINIMAL_RATE_HZ) / 255
-
-CHANNEL_BITS_TEMPLATE = np.zeros(31, dtype=bool)
-
-
-BranchConfiguration = namedtuple(
-    field_names=["branch_address", "interpretation"], typename="BranchConfiguration"
-)
-
-
-class SummarysliceReader:
-    """
-    A reader for summaryslices which are loaded as chunks given by step_size.
-
-    To be used as an iterator (`for chunks in SummarysliceReader(...): ...`)
-    """
-
-    TREE_ADDR = "KM3NET_SUMMARYSLICE/KM3NET_SUMMARYSLICE"
-    _subbranches = [
-        BranchConfiguration(
-            "KM3NETDAQ::JDAQSummarysliceHeader",
-            uproot.interpretation.numerical.AsDtype(
-                [
-                    (" cnt", "u4"),
-                    (" vers", "u2"),
-                    (" cnt2", "u4"),
-                    (" vers2", "u2"),
-                    (" cnt3", "u4"),
-                    (" vers3", "u2"),
-                    ("detector_id", ">i4"),
-                    ("run", ">i4"),
-                    ("frame_index", ">i4"),
-                    (" cnt4", "u4"),
-                    (" vers4", "u2"),
-                    ("UTC_seconds", ">u4"),
-                    ("UTC_16nanosecondcycles", ">u4"),
-                ]
-            ),
-        ),
-        BranchConfiguration(
-            "vector<KM3NETDAQ::JDAQSummaryFrame>",
-            uproot.interpretation.jagged.AsJagged(
-                uproot.interpretation.numerical.AsDtype(
-                    [
-                        ("dom_id", ">i4"),
-                        ("dq_status", ">u4"),
-                        ("hrv", ">u4"),
-                        ("fifo", ">u4"),
-                        ("status3", ">u4"),
-                        ("status4", ">u4"),
-                    ]
-                    + [(f"ch{c}", "u1") for c in range(31)]
-                ),
-                header_bytes=10,
-            ),
-        ),
-    ]
-
-    def __init__(self, fobj, step_size=1000):
-        if isinstance(fobj, str):
-            self._fobj = uproot.open(fobj)
-        else:
-            self._fobj = fobj
-        self._step_size = step_size
-        self._branch = self._fobj[self.TREE_ADDR]
-
-        self.ChunksConstructor = namedtuple(
-            field_names=["headers", "slices"], typename="SummarysliceChunk"
-        )
-
-    def _chunks_generator(self):
-        for chunk in self._branch.iterate(
-            dict(self._subbranches), step_size=self._step_size
-        ):
-            yield self.ChunksConstructor(
-                *[getattr(chunk, bc.branch_address) for bc in self._subbranches]
-            )
-
-    def __getitem__(self, idx):
-        if idx >= len(self) or idx < -len(self):
-            raise IndexError("Chunk index out of range")
-
-        s = self._step_size
-
-        if idx < 0:
-            idx = len(self) + idx
-
-        chunk = self._branch.arrays(
-            dict(self._subbranches), entry_start=idx * s, entry_stop=(idx + 1) * s
-        )
-        return self.ChunksConstructor(
-            *[getattr(chunk, bc.branch_address) for bc in self._subbranches]
-        )
-
-    def __iter__(self):
-        self._chunks = self._chunks_generator()
-        return self
-
-    def __next__(self):
-        return next(self._chunks)
-
-    def __len__(self):
-        return int(np.ceil(self._branch.num_entries / self._step_size))
-
-    def __repr__(self):
-        step_size = self._step_size
-        n_items = self._branch.num_entries
-        cls_name = self.__class__.__name__
-        n_chunks = len(self)
-        return (
-            f"<{cls_name} {n_items} items, step_size={step_size} "
-            f"({n_chunks} chunk{'' if n_chunks == 1 else 's'})>"
-        )
-
-
-@nb.vectorize(
-    [nb.int32(nb.int8), nb.int32(nb.int16), nb.int32(nb.int32), nb.int32(nb.int64)]
-)
-def get_rate(value):  # pragma: no cover
-    """Return the rate in Hz from the short int value"""
-    if value == 0:
-        return 0
-    else:
-        return MINIMAL_RATE_HZ * np.exp(value * RATE_FACTOR)
-
-
-@nb.guvectorize(
-    "void(i8, b1[:], b1[:])", "(), (n) -> (n)", target="parallel", nopython=True
-)
-def unpack_bits(value, bits_template, out):  # pragma: no cover
-    """Return a boolean array for a value's bit representation.
-
-    This function also accepts arrays as input, the output shape will be
-    NxM where N is the number of input values and M the length of the
-    ``bits_template`` array, which is just a dummy array, due to the weird
-    signature system of numba.
-
-    Parameters
-    ----------
-    value: int or np.array(int) with shape (N,)
-        The binary value of containing the bit information
-    bits_template: np.array() with shape (M,)
-        The template for the output array, the only important is its shape
-
-    Returns
-    -------
-    np.array(bool) either with shape (M,) or (N, M)
-    """
-    for i in range(bits_template.shape[0]):
-        out[30 - i] = value & (1 << i) > 0
-
-
-def get_channel_flags(value):
-    """Returns the hrv/fifo flags for the PMT channels (hrv/fifo)
-
-    Parameters
-    ----------
-    value : int32
-        The integer value to be parsed.
-    """
-    channel_bits = np.bitwise_and(value, 0x7FFFFFFF)
-    flags = unpack_bits(channel_bits, CHANNEL_BITS_TEMPLATE)
-    return np.flip(flags, axis=-1)
-
-
-def get_number_udp_packets(value):
-    """Returns the number of received UDP packets (dq_status)
-
-    Parameters
-    ----------
-    value : int32
-        The integer value to be parsed.
-    """
-    return np.bitwise_and(value, 0x7FFF)
-
-
-def get_udp_max_sequence_number(value):
-    """Returns the maximum sequence number of the received UDP packets (dq_status)
-
-    Parameters
-    ----------
-    value : int32
-        The integer value to be parsed.
-    """
-    return np.right_shift(value, 16)
-
-
-def has_udp_trailer(value):
-    """Returns the UDP Trailer flag (fifo)
-
-    Parameters
-    ----------
-    value : int32
-        The integer value to be parsed.
-    """
-    return np.any(np.bitwise_and(value, np.left_shift(1, 31)))
-
-
-class OnlineReader:
-    """Reader for online ROOT files"""
-
-    def __init__(self, filename):
-        self._fobj = uproot3.open(filename)
-        self._filename = filename
-        self._events = None
-        self._timeslices = None
-        self._summaryslices = None
-        self._uuid = binascii.hexlify(self._fobj._context.uuid).decode("ascii")
-
-    @property
-    def uuid(self):
-        return self._uuid
-
-    def close(self):
-        self._fobj.close()
-
-    def __enter__(self):
-        return self
-
-    def __exit__(self, *args):
-        self.close()
-
-    @property
-    def events(self):
-        if self._events is None:
-            tree = self._fobj["KM3NET_EVENT"]
-
-            headers = tree["KM3NETDAQ::JDAQEventHeader"].array(
-                uproot3.interpret(tree["KM3NETDAQ::JDAQEventHeader"], cntvers=True)
-            )
-            snapshot_hits = tree["snapshotHits"].array(
-                uproot3.asjagged(
-                    uproot3.astable(
-                        uproot3.asdtype(
-                            [
-                                ("dom_id", ">i4"),
-                                ("channel_id", "u1"),
-                                ("time", "<u4"),
-                                ("tot", "u1"),
-                            ]
-                        )
-                    ),
-                    skipbytes=10,
-                )
-            )
-            triggered_hits = tree["triggeredHits"].array(
-                uproot3.asjagged(
-                    uproot3.astable(
-                        uproot3.asdtype(
-                            [
-                                ("dom_id", ">i4"),
-                                ("channel_id", "u1"),
-                                ("time", "<u4"),
-                                ("tot", "u1"),
-                                (" cnt", "u4"),
-                                (" vers", "u2"),
-                                ("trigger_mask", ">u8"),
-                            ]
-                        )
-                    ),
-                    skipbytes=10,
-                )
-            )
-            self._events = OnlineEvents(headers, snapshot_hits, triggered_hits)
-        return self._events
-
-    @property
-    def timeslices(self):
-        if self._timeslices is None:
-            self._timeslices = Timeslices(self._fobj)
-        return self._timeslices
-
-    @property
-    def summaryslices(self):
-        if self._summaryslices is None:
-            self._summaryslices = SummarysliceReader(
-                uproot.open(self._filename)
-            )  # TODO: remove when using uproot4
-        return self._summaryslices
-
-
-class Timeslices:
-    """A simple wrapper for timeslices"""
-
-    def __init__(self, fobj):
-        self._fobj = fobj
-        self._timeslices = {}
-        self._read_streams()
-
-    def _read_streams(self):
-        """Read the L0, L1, L2 and SN streams if available"""
-        streams = set(
-            s.split(b"KM3NET_TIMESLICE_")[1].split(b";")[0]
-            for s in self._fobj.keys()
-            if b"KM3NET_TIMESLICE_" in s
-        )
-        for stream in streams:
-            tree = self._fobj[b"KM3NET_TIMESLICE_" + stream][
-                b"KM3NETDAQ::JDAQTimeslice"
-            ]
-            headers = tree[b"KM3NETDAQ::JDAQTimesliceHeader"][b"KM3NETDAQ::JDAQHeader"][
-                b"KM3NETDAQ::JDAQChronometer"
-            ]
-            if len(headers) == 0:
-                continue
-            superframes = tree[b"vector<KM3NETDAQ::JDAQSuperFrame>"]
-            hits_dtype = np.dtype([("pmt", "u1"), ("tdc", "<u4"), ("tot", "u1")])
-            hits_buffer = superframes[
-                b"vector<KM3NETDAQ::JDAQSuperFrame>.buffer"
-            ].lazyarray(
-                uproot3.asjagged(
-                    uproot3.astable(uproot3.asdtype(hits_dtype)), skipbytes=6
-                ),
-                basketcache=uproot3.cache.ThreadSafeArrayCache(
-                    TIMESLICE_FRAME_BASKET_CACHE_SIZE
-                ),
-            )
-            self._timeslices[stream.decode("ascii")] = (
-                headers,
-                superframes,
-                hits_buffer,
-            )
-            setattr(
-                self,
-                stream.decode("ascii"),
-                TimesliceStream(headers, superframes, hits_buffer),
-            )
-
-    def stream(self, stream, idx):
-        ts = self._timeslices[stream]
-        return Timeslice(*ts, idx, stream)
-
-    def __str__(self):
-        return "Available timeslice streams: {}".format(
-            ", ".join(s for s in self._timeslices.keys())
-        )
-
-    def __repr__(self):
-        return str(self)
-
-
-class TimesliceStream:
-    def __init__(self, headers, superframes, hits_buffer):
-        # self.headers = headers.lazyarray(
-        #     uproot3.asjagged(uproot3.astable(
-        #         uproot3.asdtype(
-        #             np.dtype([('a', 'i4'), ('b', 'i4'), ('c', 'i4'),
-        #                       ('d', 'i4'), ('e', 'i4')]))),
-        #                     skipbytes=6),
-        #     basketcache=uproot3.cache.ThreadSafeArrayCache(
-        #         TIMESLICE_FRAME_BASKET_CACHE_SIZE))
-        self.headers = headers
-        self.superframes = superframes
-        self._hits_buffer = hits_buffer
-
-    # def frames(self):
-    #     n_hits = self._superframe[
-    #         b'vector<KM3NETDAQ::JDAQSuperFrame>.numberOfHits'].lazyarray(
-    #             basketcache=BASKET_CACHE)[self._idx]
-    #     module_ids = self._superframe[
-    #         b'vector<KM3NETDAQ::JDAQSuperFrame>.id'].lazyarray(basketcache=BASKET_CACHE)[self._idx]
-    #     idx = 0
-    #     for module_id, n_hits in zip(module_ids, n_hits):
-    #         self._frames[module_id] = hits_buffer[idx:idx + n_hits]
-    #         idx += n_hits
-
-
-class Timeslice:
-    """A wrapper for a timeslice"""
-
-    def __init__(self, header, superframe, hits_buffer, idx, stream):
-        self.header = header
-        self._frames = {}
-        self._superframe = superframe
-        self._hits_buffer = hits_buffer
-        self._idx = idx
-        self._stream = stream
-        self._n_frames = None
-
-    @property
-    def frames(self):
-        if not self._frames:
-            self._read_frames()
-        return self._frames
-
-    def _read_frames(self):
-        """Populate a dictionary of frames with the module ID as key"""
-        hits_buffer = self._hits_buffer[self._idx]
-        n_hits = self._superframe[
-            b"vector<KM3NETDAQ::JDAQSuperFrame>.numberOfHits"
-        ].lazyarray(basketcache=BASKET_CACHE)[self._idx]
-        try:
-            module_ids = self._superframe[
-                b"vector<KM3NETDAQ::JDAQSuperFrame>.id"
-            ].lazyarray(basketcache=BASKET_CACHE)[self._idx]
-        except KeyError:
-            module_ids = (
-                self._superframe[
-                    b"vector<KM3NETDAQ::JDAQSuperFrame>.KM3NETDAQ::JDAQModuleIdentifier"
-                ]
-                .lazyarray(
-                    uproot3.asjagged(
-                        uproot3.astable(uproot3.asdtype([("dom_id", ">i4")]))
-                    ),
-                    basketcache=BASKET_CACHE,
-                )[self._idx]
-                .dom_id
-            )
-
-        idx = 0
-        for module_id, n_hits in zip(module_ids, n_hits):
-            self._frames[module_id] = hits_buffer[idx : idx + n_hits]
-            idx += n_hits
-
-    def __len__(self):
-        if self._n_frames is None:
-            self._n_frames = len(
-                self._superframe[b"vector<KM3NETDAQ::JDAQSuperFrame>.id"].lazyarray(
-                    basketcache=BASKET_CACHE
-                )[self._idx]
-            )
-        return self._n_frames
-
-    def __str__(self):
-        return "{} timeslice with {} frames.".format(self._stream, len(self))
-
-    def __repr__(self):
-        return "<{}: {} entries>".format(self.__class__.__name__, len(self.header))
-
-
-class OnlineEvents:
-    """A simple wrapper for online events"""
-
-    def __init__(self, headers, snapshot_hits, triggered_hits):
-        self.headers = headers
-        self.snapshot_hits = snapshot_hits
-        self.triggered_hits = triggered_hits
-
-    def __getitem__(self, item):
-        return OnlineEvent(
-            self.headers[item], self.snapshot_hits[item], self.triggered_hits[item]
-        )
-
-    def __len__(self):
-        return len(self.headers)
-
-    def __str__(self):
-        return "Number of events: {}".format(len(self.headers))
-
-    def __repr__(self):
-        return str(self)
-
-
-class OnlineEvent:
-    """A wrapper for a online event"""
-
-    def __init__(self, header, snapshot_hits, triggered_hits):
-        self.header = header
-        self.snapshot_hits = snapshot_hits
-        self.triggered_hits = triggered_hits
-
-    def __str__(self):
-        return "Online event with {} snapshot hits and {} triggered hits".format(
-            len(self.snapshot_hits), len(self.triggered_hits)
-        )
-
-    def __repr__(self):
-        return str(self)
diff --git a/src/km3io/rootio.py b/src/km3io/rootio.py
index 6133fca4607749a3d7f46a7680722feaeb59e304..5f83fd1255f0004ee80a43921b2b83c10ca5cf8b 100644
--- a/src/km3io/rootio.py
+++ b/src/km3io/rootio.py
@@ -60,7 +60,7 @@ class EventReader:
         else:
             raise TypeError("Unsupported file descriptor.")
         self._step_size = step_size
-        self._uuid = self._fobj._file.uuid
+        self._uuid = self._fobj.parent.uuid
         self._iterator_index = 0
         self._keys = keys
         self._event_ctor = event_ctor
diff --git a/src/km3io/tools.py b/src/km3io/tools.py
index 943a875184bb11ea1cfa6f3b7e35ead29226993b..b776a917df2daedcbdf3e0e9209c209d8605387e 100644
--- a/src/km3io/tools.py
+++ b/src/km3io/tools.py
@@ -3,7 +3,6 @@ from collections import namedtuple
 import numba as nb
 import numpy as np
 import awkward as ak
-import uproot3
 
 import km3io.definitions
 from km3io.definitions import reconstruction as krec
@@ -12,10 +11,6 @@ from km3io.definitions import fitparameters as kfit
 from km3io.definitions import w2list_genhen as kw2gen
 from km3io.definitions import w2list_gseagen as kw2gsg
 
-# 110 MB based on the size of the largest basket found so far in km3net
-BASKET_CACHE_SIZE = 110 * 1024**2
-BASKET_CACHE = uproot3.cache.ThreadSafeArrayCache(BASKET_CACHE_SIZE)
-
 
 class cached_property:
     """A simple cache decorator for properties."""
@@ -312,41 +307,27 @@ def mask(arr, sequence=None, startend=None, minmax=None, atleast=None):
             elif atleast is not None:
                 np_array = _mask_atleast(ak.Array(layout), np.array(atleast))
 
-            return ak.layout.NumpyArray(np_array)
+            return ak.contents.NumpyArray(np_array)
 
-        elif isinstance(
-            layout,
-            (
-                ak.layout.ListArray32,
-                ak.layout.ListArrayU32,
-                ak.layout.ListArray64,
-            ),
-        ):
+        elif isinstance(layout, ak.contents.ListArray):
             if len(layout.stops) == 0:
                 content = recurse(layout.content)
             else:
                 content = recurse(layout.content[: np.max(layout.stops)])
             return type(layout)(layout.starts, layout.stops, content)
 
-        elif isinstance(
-            layout,
-            (
-                ak.layout.ListOffsetArray32,
-                ak.layout.ListOffsetArrayU32,
-                ak.layout.ListOffsetArray64,
-            ),
-        ):
+        elif isinstance(layout, ak.contents.ListOffsetArray):
             content = recurse(layout.content[: layout.offsets[-1]])
             return type(layout)(layout.offsets, content)
 
-        elif isinstance(layout, ak.layout.RegularArray):
+        elif isinstance(layout, ak.contents.RegularArray):
             content = recurse(layout.content)
-            return ak.layout.RegularArray(content, layout.size)
+            return ak.contents.RegularArray(content, layout.size)
 
         else:
             raise NotImplementedError(repr(arr))
 
-    layout = ak.to_layout(arr, allow_record=True, allow_other=False)
+    layout = ak.to_layout(arr, allow_record=True)
     return ak.Array(recurse(layout))
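For context on the migration above: in awkward 2 the low-level layout classes moved from ``ak.layout`` to ``ak.contents`` and are no longer specialised by index bit width, which is why a single ``isinstance`` check per class now suffices in ``mask()``. A minimal sketch, assuming ``awkward>=2`` is installed and using a toy array purely for illustration:

.. code-block:: python3

  import awkward as ak

  # A jagged toy array; ak.Array builds a ListOffsetArray layout for it.
  arr = ak.Array([[1.0, 2.0, 3.0], [], [4.0, 5.0]])
  layout = ak.to_layout(arr, allow_record=True)

  print(isinstance(layout, ak.contents.ListOffsetArray))     # True
  print(isinstance(layout.content, ak.contents.NumpyArray))  # True
  print(ak.Array(layout).to_list())                          # [[1.0, 2.0, 3.0], [], [4.0, 5.0]]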
 
 
diff --git a/src/km3io/utils/kprinttree.py b/src/km3io/utils/kprinttree.py
deleted file mode 100644
index 6a867b153813c7f8f7b2cda10a4630ab43d59010..0000000000000000000000000000000000000000
--- a/src/km3io/utils/kprinttree.py
+++ /dev/null
@@ -1,45 +0,0 @@
-#!/usr/bin/env python
-# coding=utf-8
-# Filename: kprinttree.py
-# Author: Tamas Gal <tgal@km3net.de>
-"""
-Print the available ROOT trees.
-
-Usage:
-    KPrintTree -f FILENAME
-    KPrintTree (-h | --help)
-
-Options:
-    -f FILENAME  The file to print (;
-    -h --help    Show this screen.
-
-"""
-import warnings
-
-with warnings.catch_warnings():
-    for warning_category in (FutureWarning, DeprecationWarning):
-        warnings.simplefilter("ignore", category=warning_category)
-    import uproot3
-
-
-def print_tree(filename):
-    f = uproot3.open(filename)
-    for key in f.keys():
-        try:
-            print("{:<30} : {:>9} items".format(key.decode(), len(f[key])))
-        except (TypeError, KeyError):
-            print("{}".format(key.decode()))
-        except NotImplementedError:
-            print("{} (TStreamerSTL)".format(key.decode()))
-
-
-def main():
-    from docopt import docopt
-
-    args = docopt(__doc__)
-
-    print_tree(args["-f"])
-
-
-if __name__ == "__main__":
-    main()
diff --git a/tests/test_online.py b/tests/test_online.py
deleted file mode 100644
index 8684f6c41e4e97e1bb71d1801956372d5210f615..0000000000000000000000000000000000000000
--- a/tests/test_online.py
+++ /dev/null
@@ -1,831 +0,0 @@
-from collections import namedtuple
-import itertools
-import os
-import re
-import unittest
-import numpy as np
-
-from km3net_testdata import data_path
-
-from km3io.online import (
-    OnlineReader,
-    SummarysliceReader,
-    get_rate,
-    has_udp_trailer,
-    get_udp_max_sequence_number,
-    get_channel_flags,
-    get_number_udp_packets,
-)
-from km3io.tools import to_num
-
-ONLINE_FILE = data_path("online/km3net_online.root")
-
-
-class TestOnlineReaderContextManager(unittest.TestCase):
-    def test_context_manager(self):
-        with OnlineReader(ONLINE_FILE) as r:
-            assert r._filename == ONLINE_FILE
-
-
-class TestUUID(unittest.TestCase):
-    def test_uuid(self):
-        assert OnlineReader(ONLINE_FILE).uuid == "00010c85603008c611ea971772f09e86beef"
-
-
-class TestOnlineEvents(unittest.TestCase):
-    def setUp(self):
-        self.events = OnlineReader(ONLINE_FILE).events
-
-    def test_index_lookup(self):
-        assert 3 == len(self.events)
-
-    def test_str(self):
-        assert re.match(".*events.*3", str(self.events))
-
-    def test_repr(self):
-        assert re.match(".*events.*3", self.events.__repr__())
-
-
-class TestOnlineEvent(unittest.TestCase):
-    def setUp(self):
-        self.event = OnlineReader(ONLINE_FILE).events[0]
-
-    def test_str(self):
-        assert re.match(".*event.*96.*snapshot.*18.*triggered", str(self.event))
-
-    def test_repr(self):
-        assert re.match(".*event.*96.*snapshot.*18.*triggered", self.event.__repr__())
-
-
-class TestOnlineEventsSnapshotHits(unittest.TestCase):
-    def setUp(self):
-        self.events = OnlineReader(ONLINE_FILE).events
-        self.lengths = {0: 96, 1: 124, -1: 78}
-        self.total_item_count = 298
-
-    def test_reading_snapshot_hits(self):
-        hits = self.events.snapshot_hits
-
-        for event_id, length in self.lengths.items():
-            assert length == len(hits[event_id].dom_id)
-            assert length == len(hits[event_id].channel_id)
-            assert length == len(hits[event_id].time)
-
-    def test_total_item_counts(self):
-        hits = self.events.snapshot_hits
-
-        assert self.total_item_count == sum(hits.dom_id.count())
-        assert self.total_item_count == sum(hits.channel_id.count())
-        assert self.total_item_count == sum(hits.time.count())
-
-    def test_data_values(self):
-        hits = self.events.snapshot_hits
-
-        self.assertListEqual(
-            [806451572, 806451572, 806455814], list(hits.dom_id[0][:3])
-        )
-        self.assertListEqual([10, 13, 0], list(hits.channel_id[0][:3]))
-        self.assertListEqual([30733918, 30733916, 30733256], list(hits.time[0][:3]))
-
-    def test_channel_ids_have_valid_values(self):
-        hits = self.events.snapshot_hits
-
-        # channel IDs are always between [0, 30]
-        assert all(c >= 0 for c in hits.channel_id.min())
-        assert all(c < 31 for c in hits.channel_id.max())
-
-
-class TestOnlineEventsTriggeredHits(unittest.TestCase):
-    def setUp(self):
-        self.events = OnlineReader(ONLINE_FILE).events
-        self.lengths = {0: 18, 1: 53, -1: 9}
-        self.total_item_count = 80
-
-    def test_data_lengths(self):
-        hits = self.events.triggered_hits
-
-        for event_id, length in self.lengths.items():
-            assert length == len(hits[event_id].dom_id)
-            assert length == len(hits[event_id].channel_id)
-            assert length == len(hits[event_id].time)
-            assert length == len(hits[event_id].trigger_mask)
-
-    def test_total_item_counts(self):
-        hits = self.events.triggered_hits
-
-        assert self.total_item_count == sum(hits.dom_id.count())
-        assert self.total_item_count == sum(hits.channel_id.count())
-        assert self.total_item_count == sum(hits.time.count())
-
-    def test_data_values(self):
-        hits = self.events.triggered_hits
-
-        self.assertListEqual(
-            [806451572, 806451572, 808432835], list(hits.dom_id[0][:3])
-        )
-        self.assertListEqual([10, 13, 1], list(hits.channel_id[0][:3]))
-        self.assertListEqual([30733918, 30733916, 30733429], list(hits.time[0][:3]))
-        self.assertListEqual([16, 16, 4], list(hits.trigger_mask[0][:3]))
-
-    def test_channel_ids_have_valid_values(self):
-        hits = self.events.triggered_hits
-
-        # channel IDs are always between [0, 30]
-        assert all(c >= 0 for c in hits.channel_id.min())
-        assert all(c < 31 for c in hits.channel_id.max())
-
-
-class TestTimeslices(unittest.TestCase):
-    def setUp(self):
-        self.ts = OnlineReader(ONLINE_FILE).timeslices
-
-    def test_data_lengths(self):
-        assert 3 == len(self.ts._timeslices["L1"][0])
-        assert 3 == len(self.ts._timeslices["SN"][0])
-        with self.assertRaises(KeyError):
-            assert 0 == len(self.ts._timeslices["L2"][0])
-        with self.assertRaises(KeyError):
-            assert 0 == len(self.ts._timeslices["L0"][0])
-
-    def test_streams(self):
-        self.ts.stream("L1", 0)
-        self.ts.stream("SN", 0)
-
-    def test_reading_frames(self):
-        assert 8 == len(self.ts.stream("SN", 1).frames[808447186])
-
-    def test_str(self):
-        s = str(self.ts)
-        assert "L1" in s
-        assert "SN" in s
-
-
-class TestTimeslice(unittest.TestCase):
-    def setUp(self):
-        self.ts = OnlineReader(ONLINE_FILE).timeslices
-        self.n_frames = {"L1": [69, 69, 69], "SN": [64, 66, 68]}
-
-    def test_str(self):
-        for stream, n_frames in self.n_frames.items():
-            print(stream, n_frames)
-            for i in range(len(n_frames)):
-                s = str(self.ts.stream(stream, i))
-                assert re.match("{}.*{}".format(stream, n_frames[i]), s)
-
-
-class TestSummaryslices(unittest.TestCase):
-    def setUp(self):
-        for chunk in OnlineReader(ONLINE_FILE).summaryslices:
-            self.ss = chunk
-            break
-
-    def test_headers(self):
-        assert 3 == len(self.ss.headers)
-        self.assertListEqual([44, 44, 44], list(self.ss.headers.detector_id))
-        self.assertListEqual([6633, 6633, 6633], list(self.ss.headers.run))
-        self.assertListEqual([126, 127, 128], list(self.ss.headers.frame_index))
-        assert 806451572 == self.ss.slices[0].dom_id[0]
-
-    def test_slices(self):
-        assert 3 == len(self.ss.slices)
-
-    def test_fifo(self):
-        s = self.ss.slices[0]
-        dct_fifo_stat = {
-            808981510: True,
-            808981523: False,
-            808981672: False,
-            808974773: False,
-        }
-        for dom_id, fifo_status in dct_fifo_stat.items():
-            frame = s[s.dom_id == dom_id]
-            assert any(get_channel_flags(frame.fifo[0])) == fifo_status
-
-    def test_has_udp_trailer(self):
-        s = self.ss.slices[0]
-        dct_udp_trailer = {
-            806451572: True,
-            806455814: True,
-            806465101: True,
-            806483369: True,
-            806487219: True,
-            806487226: True,
-            806487231: True,
-            808432835: True,
-            808435278: True,
-            808447180: True,
-            808447186: True,
-        }
-        for dom_id, udp_trailer in dct_udp_trailer.items():
-            frame = s[s.dom_id == dom_id]
-            assert has_udp_trailer(frame.fifo[0]) == udp_trailer
-
-    def test_high_rate_veto(self):
-        s = self.ss.slices[0]
-        dct_high_rate_veto = {
-            808489014: True,
-            808489117: False,
-            808493910: True,
-            808946818: True,
-            808951460: True,
-            808956908: True,
-            808959411: True,
-            808961448: True,
-            808961480: True,
-            808961504: True,
-            808961655: False,
-            808964815: False,
-            808964852: True,
-            808969848: False,
-            808969857: True,
-            808972593: True,
-            808972598: True,
-            808972698: False,
-            808974758: False,
-            808974773: True,
-            808974811: True,
-            808974972: True,
-            808976377: True,
-            808979567: False,
-            808979721: False,
-            808979729: False,
-            808981510: True,
-            808981523: True,
-            808981672: False,
-            808981812: True,
-            808981864: False,
-            808982018: False,
-        }
-        for dom_id, high_rate_veto in dct_high_rate_veto.items():
-            frame = s[s.dom_id == dom_id]
-            assert any(get_channel_flags(frame.hrv[0])) == high_rate_veto
-
-    def test_max_sequence_number(self):
-        s = self.ss.slices[0]
-        dct_seq_numbers = {
-            808974758: 18,
-            808974773: 26,
-            808974811: 25,
-            808974972: 41,
-            808976377: 35,
-            808979567: 20,
-            808979721: 17,
-            808979729: 25,
-            808981510: 35,
-            808981523: 27,
-            808981672: 17,
-            808981812: 34,
-            808981864: 18,
-            808982018: 21,
-            808982041: 27,
-            808982077: 32,
-            808982547: 20,
-            808984711: 26,
-            808996773: 31,
-            808997793: 21,
-            809006037: 26,
-            809007627: 18,
-            809503416: 28,
-            809521500: 31,
-            809524432: 21,
-            809526097: 23,
-            809544058: 21,
-            809544061: 23,
-        }
-        for dom_id, max_sequence_number in dct_seq_numbers.items():
-            frame = s[s.dom_id == dom_id]
-            assert (
-                get_udp_max_sequence_number(frame.dq_status[0]) == max_sequence_number
-            )
-
-    def test_number_udp_packets(self):
-        s = self.ss.slices[0]
-        dct_n_packets = {
-            808451904: 27,
-            808451907: 22,
-            808469129: 20,
-            808472260: 21,
-            808472265: 22,
-            808488895: 20,
-            808488990: 20,
-            808489014: 28,
-            808489117: 22,
-            808493910: 26,
-            808946818: 23,
-            808951460: 37,
-            808956908: 33,
-            808959411: 36,
-            808961448: 28,
-            808961480: 24,
-            808961504: 28,
-            808961655: 20,
-            808964815: 20,
-            808964852: 28,
-            808969848: 21,
-        }
-        for dom_id, n_udp_packets in dct_n_packets.items():
-            frame = s[s.dom_id == dom_id]
-            assert get_number_udp_packets(frame.dq_status[0]) == n_udp_packets
-
-    def test_hrv_flags(self):
-        s = self.ss.slices[0]
-        dct_hrv_flags = {
-            809524432: [
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-            ],
-            809526097: [
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                True,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                True,
-                False,
-                False,
-                False,
-                False,
-            ],
-            809544058: [
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-            ],
-            809544061: [
-                False,
-                True,
-                False,
-                False,
-                False,
-                True,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                True,
-                False,
-                False,
-                False,
-                False,
-                False,
-                True,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-            ],
-        }
-        for dom_id, hrv_flags in dct_hrv_flags.items():
-            frame = s[s.dom_id == dom_id]
-            assert any(
-                [a == b for a, b in zip(get_channel_flags(frame.hrv[0]), hrv_flags)]
-            )
-
-    def test_fifo_flags(self):
-        s = self.ss.slices[0]
-        dct_fifo_flags = {
-            808982547: [
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-            ],
-            808984711: [
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-            ],
-            808996773: [
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-            ],
-            808997793: [
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-            ],
-            809006037: [
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-            ],
-            808981510: [
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                False,
-                True,
-                True,
-                False,
-                False,
-                False,
-                True,
-                False,
-                True,
-                True,
-                True,
-                True,
-                True,
-                True,
-                False,
-                False,
-                True,
-                False,
-            ],
-        }
-        for dom_id, fifo_flags in dct_fifo_flags.items():
-            frame = s[s.dom_id == dom_id]
-            assert any(
-                [a == b for a, b in zip(get_channel_flags(frame.fifo[0]), fifo_flags)]
-            )
-
-    def test_str(self):
-        print(str(self.ss))
-
-
-class TestGetChannelFlags_Issue59(unittest.TestCase):
-    def test_sample_summaryslice_dump(self):
-        fieldnames = ["dom_id"]
-
-        for i in range(31):
-            fieldnames.append(f"ch{i}")
-            fieldnames.append(f"hrvfifo{i}")
-
-        Entry = namedtuple("Entry", fieldnames)
-
-        with open(
-            data_path("online/KM3NeT_00000049_00008456.summaryslice-167941.txt")
-        ) as fobj:
-            ref_entries = [Entry(*list(l.strip().split())) for l in fobj.readlines()]
-
-        r = OnlineReader(
-            data_path("online/KM3NeT_00000049_00008456.summaryslice-167941.root")
-        )
-
-        for chunks in r.summaryslices:
-            summaryslice = chunks.slices[0]
-            break
-
-        for ours, ref in zip(summaryslice, ref_entries):
-            assert ours.dom_id == to_num(ref.dom_id)
-            fifos = get_channel_flags(ours.fifo)
-            hrvs = get_channel_flags(ours.hrv)
-            for i in range(31):
-                attr = f"ch{i}"
-                self.assertAlmostEqual(
-                    get_rate(getattr(ours, attr)) / 1000.0,
-                    to_num(getattr(ref, attr)),
-                    places=1,
-                )
-
-                hrvfifo = getattr(ref, f"hrvfifo{i}")
-                ref_hrv = bool(int(hrvfifo[0]))
-                ref_fifo = bool(int(hrvfifo[1]))
-                assert hrvs[i] == ref_hrv
-                assert fifos[i] == ref_fifo
-
-
-class TestGetRate(unittest.TestCase):
-    def test_zero(self):
-        assert 0 == get_rate(0)
-
-    def test_some_values(self):
-        assert 2054 == get_rate(1)
-        assert 55987 == get_rate(123)
-        assert 1999999 == get_rate(255)
-
-    def test_vectorized_input(self):
-        self.assertListEqual([2054], list(get_rate([1])))
-        self.assertListEqual([2054, 2111, 2169], list(get_rate([1, 2, 3])))
-
-
-class TestSummarysliceReader(unittest.TestCase):
-    def test_init(self):
-        sr = SummarysliceReader(data_path("online/km3net_online.root"))
-
-    def test_length(self):
-        sr = SummarysliceReader(data_path("online/km3net_online.root"))
-        assert 1 == len(sr)
-        sr = SummarysliceReader(data_path("online/km3net_online.root"), step_size=2)
-        assert 2 == len(sr)
-        sr = SummarysliceReader(data_path("online/km3net_online.root"), step_size=3)
-        assert 1 == len(sr)
-
-    def test_getitem_raises_when_out_of_range(self):
-        sr = SummarysliceReader(data_path("online/km3net_online.root"), step_size=1)
-        with self.assertRaises(IndexError):
-            sr[123]
-        with self.assertRaises(IndexError):
-            sr[-123]
-        with self.assertRaises(IndexError):
-            sr[3]
-        sr[-3]  # this should still work, gives the first element in this case
-        with self.assertRaises(IndexError):
-            sr[-4]
-
-    def test_getitem(self):
-        sr = SummarysliceReader(data_path("online/km3net_online.root"), step_size=1)
-        for idx in range(len(sr)):
-            assert len(sr[idx].headers) == 1
-            assert len(sr[idx].slices) == 1
-
-        first_frame_index = sr[0].headers.frame_index  # 126
-        last_frame_index = sr[2].headers.frame_index  # 128
-
-        assert 126 == first_frame_index
-        assert 128 == last_frame_index
-
-        sr = SummarysliceReader(data_path("online/km3net_online.root"), step_size=2)
-        assert len(sr[0].headers) == 2
-        assert len(sr[0].slices) == 2
-        assert len(sr[1].headers) == 1
-        assert len(sr[1].slices) == 1
-        with self.assertRaises(IndexError):
-            assert len(sr[2].headers) == 0
-            assert len(sr[2].slices) == 0
-
-        assert first_frame_index == sr[0].headers[0].frame_index
-        assert last_frame_index == sr[1].headers[0].frame_index
-
-        assert last_frame_index == sr[-1].headers[0].frame_index
-        assert first_frame_index == sr[-2].headers[0].frame_index
-
-    def test_iterate_with_step_size_one(self):
-        sr = SummarysliceReader(data_path("online/km3net_online.root"), step_size=1)
-        i = 0
-        for ss in sr:
-            i += 1
-        assert i == 3
-
-    def test_iterate_with_step_size_bigger_than_number_of_elements(self):
-        sr = SummarysliceReader(data_path("online/km3net_online.root"), step_size=1000)
-        i = 0
-        for ss in sr:
-            i += 1
-        assert i == 1
-
-    def test_iterate_gives_correct_data_slices(self):
-        sr = SummarysliceReader(data_path("online/km3net_online.root"), step_size=1000)
-
-        for ss in sr:
-            self.assertListEqual(
-                ss.slices[0].dom_id[:3].to_list(), [806451572, 806455814, 806465101]
-            )
-            self.assertListEqual(
-                ss.slices[0].dom_id[-3:].to_list(), [809526097, 809544058, 809544061]
-            )
-            assert len(ss.slices) == 3
-            assert len(ss.slices[0]) == 64
-            assert len(ss.slices[1]) == 66
-            assert len(ss.slices[2]) == 68
-            self.assertListEqual(ss.slices[0].ch5[:3].to_list(), [75, 62, 55])
-
-        sr = SummarysliceReader(data_path("online/km3net_online.root"), step_size=1)
-
-        lengths = [64, 66, 68]
-
-        for idx, ss in enumerate(sr):
-            # self.assertListEqual(ss[0].dom_id[:3].to_list(), [806451572, 806455814, 806465101])
-            # self.assertListEqual(ss[0].dom_id[-3:].to_list(), [809526097, 809544058, 809544061])
-            assert len(ss.slices) == 1
-            assert len(ss.slices[0]) == lengths[idx]
-            assert len(ss.slices[0].dom_id) == lengths[idx]
-            assert len(ss.slices[0].ch3) == lengths[idx]