diff --git a/notebooks/ANTARES_PointSource.ipynb b/notebooks/ANTARES_PointSource.ipynb
new file mode 100644
index 0000000000000000000000000000000000000000..1404aedebde1277249318bdffe4625e2e2d008f1
--- /dev/null
+++ b/notebooks/ANTARES_PointSource.ipynb
@@ -0,0 +1,360 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Dummy point-source study with ANTARES data\n",
+    "\n",
+    "The gammapy package is used to compute Feldman-Cousins statistics.\n",
+    "\n",
+    "## Prerequisites\n",
+    "`pip install gammapy matplotlib openkm3`"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 9,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import pandas as pd\n",
+    "import numpy as np\n",
+    "import matplotlib.pyplot as plt\n",
+    "import math\n",
+    "from scipy import stats\n",
+    "import gammapy.stats as gstats\n",
+    "import openkm3"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Helper functions\n",
+    "\n",
+    "### Background evaluation\n",
+    "The function *get_number_background(interpolation, decl, roi)* evaluates the expected background for a given sky declination and a given region of interest (the 'half width' of the declination band around the source declination). The right ascension is not used here, as the background distribution depends only on the visibility of the given sky position from the location of ANTARES, and therefore on the declination only.\n",
+    "\n",
+    "This function uses a polynomial interpolation to smooth the background distribution otherwise obtained from the data. It returns the number of expected background events, indicated in the following as nbkg."
+   ]
+  },
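+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The `interpolation` argument passed below is assumed to map a declination (in degrees) to cumulative background counts. A minimal sketch of building such a polynomial smoothing from binned data (all numbers here are illustrative placeholders, not ANTARES values):\n",
+    "```python\n",
+    "import numpy as np\n",
+    "\n",
+    "# illustrative cumulative background counts vs declination (degrees)\n",
+    "decl_points = np.linspace(-80, 50, 14)\n",
+    "cum_counts = 50.0 * (decl_points + 80)  # placeholder for data-driven counts\n",
+    "\n",
+    "# polynomial smoothing of the cumulative distribution\n",
+    "interpolation = np.poly1d(np.polyfit(decl_points, cum_counts, deg=3))\n",
+    "```"
+   ]
+  },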
+  {
+   "cell_type": "code",
+   "execution_count": 10,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def get_number_background(interpolation, decl, roi):  \n",
+    "\n",
+    "    dec_low = decl - roi\n",
+    "    dec_high = decl + roi\n",
+    "    \n",
+    "    # estimation of background rate over the full declination band                                     \n",
+    "    bkg_low = interpolation(dec_low)\n",
+    "    bkg_high = interpolation(dec_high)\n",
+    "    bkg_band = bkg_high - bkg_low\n",
+    "    \n",
+    "    # solid angle of declination band\n",
+    "    solid_angle = abs(2*math.pi*(np.sin(math.radians(dec_high)) - np.sin(math.radians(dec_low))))\n",
+    "    \n",
+    "    # background rate per unit solid angle\n",
+    "    bkg_rate = bkg_band / solid_angle \n",
+    "    \n",
+    "    # expected counts in the RoI\n",
+    "    bkg_count = bkg_rate * math.pi * pow(math.radians(roi), 2)\n",
+    "    return bkg_count"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "scrolled": true
+   },
+   "source": [
+    "### Limit calculation (Feldman-Cousins)\n",
+    "The functions *get_upper_limit(nbkg, nobs)* and *get_lower_limit(nbkg, nobs)* compute the upper and lower limits in the presence of nbkg background events, when nobs events are observed inside the region of interest."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 11,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# compute upper limit according to Feldman-Cousins\n",
+    "\n",
+    "def get_upper_limit(nbkg, nobs):\n",
+    "    \n",
+    "    if nobs <= int(nbkg):\n",
+    "        ul = get_sensitivity(nbkg)   # if underfluctuation, set the UL to the sensitivity\n",
+    "    else:\n",
+    "        x_bins = np.arange(0, 50)\n",
+    "        mu_bins = np.linspace(0, 15, 300 + 1, endpoint=True) # 300 = 15 / 0.05\n",
+    "        matrix = [stats.poisson(mu + nbkg).pmf(x_bins) for mu in mu_bins] # check nbkg!\n",
+    "        acceptance_intervals = gstats.fc_construct_acceptance_intervals_pdfs(matrix, 0.9)\n",
+    "\n",
+    "        LowerLimitNum, UpperLimitNum, _ = gstats.fc_get_limits(mu_bins, x_bins, acceptance_intervals)\n",
+    "        ul = UpperLimitNum[nobs]  # limit for the measured number of events\n",
+    "    \n",
+    "    return ul\n",
+    "\n",
+    "def get_lower_limit(nbkg, nobs):\n",
+    "    \n",
+    "    x_bins = np.arange(0, 50)\n",
+    "    mu_bins = np.linspace(0, 15, 300 + 1, endpoint=True) # 300 = 15 / 0.05\n",
+    "    matrix = [stats.poisson(mu + nbkg).pmf(x_bins) for mu in mu_bins] # check nbkg!\n",
+    "    acceptance_intervals = gstats.fc_construct_acceptance_intervals_pdfs(matrix, 0.9)\n",
+    "\n",
+    "    LowerLimitNum, UpperLimitNum, _ = gstats.fc_get_limits(mu_bins, x_bins, acceptance_intervals)\n",
+    "    ll = LowerLimitNum[nobs]  # limit for the measured number of events\n",
+    "    \n",
+    "    return ll"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Sensitivity calculation\n",
+    "\n",
+    "The function *get_sensitivity(nbkg)* calculates the sensitivity, defined as the average upper limit that is reached in the presence of nbkg background events in the declination band of the source."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 12,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "confidence_level = 0.9 # set confidence level to 90%  \n",
+    "\n",
+    "def get_sensitivity(nbkg):\n",
+    "    x_bins = np.arange(0, 20)\n",
+    "    mu_bins = np.linspace(0, 200, 20000 + 1, endpoint=True) # 20000 = 200 / 0.01\n",
+    "    matrix = [stats.poisson(mu + nbkg).pmf(x_bins) for mu in mu_bins] # check nbkg!\n",
+    "    acceptance_intervals = gstats.fc_construct_acceptance_intervals_pdfs(matrix, confidence_level)\n",
+    "\n",
+    "    LowerLimitNum, UpperLimitNum, _ = gstats.fc_get_limits(mu_bins, x_bins, acceptance_intervals)\n",
+    "    averageUL = gstats.fc_find_average_upper_limit(x_bins, matrix, UpperLimitNum, mu_bins)\n",
+    "\n",
+    "    return averageUL "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Picking the number of observed events from the full ANTARES list\n",
+    "Retrieves the number of observed events from the event list for a given declination and region of interest.\n",
+    "\n",
+    "-> This function could actually be replaced using the pyvo interface, as the VO offers the SCS protocol to do exactly this."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 13,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def get_number_observed_events(eventlist, decl, roi):\n",
+    "    # source direction (uses the global ra_source defined below)\n",
+    "    dir_source = make_vector(decl, ra_source)\n",
+    "    obs = 0\n",
+    "    for counter in range(len(eventlist)):\n",
+    "        # build a unit vector from the RA and dec of each ANTARES track\n",
+    "        event = make_vector(eventlist.Decl[counter], eventlist.RA[counter])\n",
+    "        alpha = calc_angle(dir_source, event)\n",
+    "\n",
+    "        # count events whose angle from the source falls inside the RoI\n",
+    "        if alpha < roi:\n",
+    "            obs = obs + 1\n",
+    "    return obs"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Math helpers\n",
+    "\n",
+    "The two functions *make_vector(Decl, RA)* and *calc_angle(v1, v2)* are auxiliary functions used later.\n",
+    "\n",
+    "*make_vector* fills a 3-dimensional vector using the sky coordinates of an event (Decl, RA), assigning length 1.\n",
+    "\n",
+    "*calc_angle* computes the angular distance (in degrees) between two points in the sky, defined as the angle between their corresponding vectors v1 and v2."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 14,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def make_vector(Decl, RA):    \n",
+    "    x = math.cos(math.radians(Decl))*math.cos(math.radians(RA))\n",
+    "    y = math.cos(math.radians(Decl))*math.sin(math.radians(RA))\n",
+    "    z = math.sin(math.radians(Decl))\n",
+    "    v =[x,y,z]\n",
+    "    return v \n",
+    "    \n",
+    "def calc_angle(v1, v2):\n",
+    "    unit_vector_1 = v1 / np.linalg.norm(v1)\n",
+    "    unit_vector_2 = v2 / np.linalg.norm(v2)\n",
+    "    dot_product = np.clip(np.dot(unit_vector_1, unit_vector_2), -1.0, 1.0)  # clip guards against rounding errors\n",
+    "    angle_between_vectors = math.degrees(np.arccos(dot_product))\n",
+    "    return angle_between_vectors"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Calculating Limits\n",
+    "\n",
+    "Here, the flux upper limits for the observation of ANTARES events from the 2007-2017 sample are calculated for a given direction in the sky.\n",
+    "\n",
+    "### Setting the point source coordinates\n",
+    "The limits are calculated by providing\n",
+    "* the source coordinates **right ascension** and **declination** (ra_source and dec_source), and\n",
+    "* the two additional source parameters **gamma** (spectral index) and\n",
+    "* **RoI** (region of interest, representing the source extension)."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 15,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "ra_source = 0.1 # right ascension between 0 and 360 degrees\n",
+    "dec_source = -29 # declination between -80 and 50 degrees\n",
+    "gamma = 2.0 # spectral index between 1.5 and 3.0\n",
+    "RoI = 2.6 # region of interest between 0.1 and 1000 (in degrees)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Getting ANTARES data and acceptance\n",
+    "This is a proposal for how retrieving the relevant ANTARES data could work.\n",
+    "The KM3Store should offer\n",
+    "* the event list\n",
+    "* the acceptance function\n",
+    "* the background estimate from interpolation\n",
+    "\n",
+    "The following calls are currently dummies, but they can be implemented accordingly in the first release of openkm3.\n",
+    "\n",
+    "See http://pi1154.physik.uni-erlangen.de:82/data/collections/1/view for what is already provided"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# read the ANTARES data tracks into a dataframe - PROPOSAL!\n",
+    "ks = openkm3.KM3Store()\n",
+    "\n",
+    "ant07_17_collection = ks.get(\"ant2017\")\n",
+    "\n",
+    "ant07_17_collection.list()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "should return a table e.g. \n",
+    "\n",
+    "| identifier | name | description | type |\n",
+    "| --------| --------- | ------- | ------- |\n",
+    "| \"events\" | ANTARES 2007-2017 neutrino events | Event list of neutrino candidates | km3.data.l4.vo.scs |\n",
+    "| \"acceptance\" | ANTARES 2007-2017 detector acceptance | acceptance table (for different spectral indices and sin(decl)) | km3.data.l4.table.d2 |\n",
+    "| \"background_distribution\" | ANTARES 2007-2017 background distribution | interpolation of expected background events for given declination and region of interest | km3.data.l4.function.d6 |"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "eventlist = ant07_17_collection.get_table(\"events\", \"pandas\") \n",
+    "# would return the pandas eventlist, though pyvo might be better suited for this\n",
+    "acceptance = ant07_17_collection.get_table(\"acceptance\", \"pandas\") \n",
+    "# would return the acceptance as table. \n",
+    "bkgfit = ant07_17_collection.get_function(\"background_distribution\") \n",
+    "# would return a function to be used for background calculation"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The acceptance table could actually be offered as a lookup table with labeled rows and columns, so that the right entry is easier to pick, e.g.\n",
+    "`acceptance = ant07_17_collection.get_table(\"acceptance\", \"lookup\")`\n",
+    "\n",
+    "### Cut and count analysis"
+   ]
+  },
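+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "A minimal sketch of such a labeled lookup using pandas (the index and column values and the variable name are assumptions for illustration, not the actual openkm3 interface):\n",
+    "```python\n",
+    "import pandas as pd\n",
+    "\n",
+    "# hypothetical acceptance table: rows = spectral index, columns = sin(declination)\n",
+    "acc = pd.DataFrame(\n",
+    "    [[0.8, 1.2], [0.5, 0.9]],\n",
+    "    index=pd.Index([2.0, 2.5], name=\"gamma\"),\n",
+    "    columns=pd.Index([-0.5, 0.0], name=\"sin_decl\"),\n",
+    ")\n",
+    "acc.loc[2.0, -0.5]  # acceptance for gamma=2.0, sin(decl)=-0.5\n",
+    "```"
+   ]
+  },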
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "nevents_obs = get_number_observed_events(eventlist, dec_source, RoI)\n",
+    "nevents_bkg = get_number_background(bkgfit, dec_source, RoI)\n",
+    "\n",
+    "print(\"estimated background: \",  nevents_bkg) \n",
+    "print(\"number of observed events: \",  nevents_obs)\n",
+    "\n",
+    "n_ul = get_upper_limit(nevents_bkg, nevents_obs)\n",
+    "n_ll = get_lower_limit(nevents_bkg, nevents_obs)\n",
+    "\n",
+    "# bins in declination and spectral index (for a binned acceptance table)\n",
+    "bin_dec = int((math.sin(math.radians(dec_source)) + 1) * 10)\n",
+    "bin_gamma = int(gamma * 10) - 15\n",
+    "\n",
+    "# convert number upper limits into flux upper limits dividing by acceptance    \n",
+    "f_ul = n_ul/acceptance({\"gamma\": gamma, \"declination\": dec_source}) \n",
+    "f_ll = n_ll/acceptance({\"gamma\": gamma, \"declination\": dec_source}) \n",
+    "\n",
+    "# output\n",
+    "print(\"Upper Limit: \", n_ul)\n",
+    "print(\"Lower Limit: \", n_ll)\n",
+    "\n",
+    "print(\"flux Upper Limit (Normalized at 1 GeV): \", f_ul, \" GeV^-1 cm^-2 s^-1\")\n",
+    "\n",
+    "print(\"flux Upper Limit (Normalized at 100 TeV): \", f_ul*pow(1e-5, gamma), \" GeV^-1 cm^-2 s^-1\")\n",
+    "\n",
+    "# if wished: plot the event sky distribution\n",
+    "eventlist.plot(kind='scatter', x='RA', y='Decl', alpha=0.1);\n",
+    "plt.show()"
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.6.9"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/pages/FAIR.md b/pages/FAIR.md
index 55814aa9c45dc61c814804765f93eb6a18b7e4c1..112a6aa1ddadabc4bea6a3bc2240dbe2f197a45f 100644
--- a/pages/FAIR.md
+++ b/pages/FAIR.md
@@ -40,5 +40,5 @@ The FAIR principles provide a solid set of requirements for the development of a
 
 #### Reusable data
 * **[Licensing standards](https://open-data.pages.km3net.de/licensing/)** for data, software and supplementary material have been introduced.
-* Basic **provenance** information is provided with the data, which serves as a development start point to propagate provenance management through the complex the full data processing workflow in the future. 
+* Basic **provenance** information is provided with the data, which serves as a development start point to propagate provenance management through the complex data processing workflow in the future. 
 
diff --git a/pages/Usecase_ANTARES.md b/pages/Usecase_ANTARES.md
index 30bc3e19458344108d53cc4f401d86b11e103d39..de306712e8a41c5e449781dcfe829f843b5f680d 100644
--- a/pages/Usecase_ANTARES.md
+++ b/pages/Usecase_ANTARES.md
@@ -10,24 +10,23 @@ status: unedited
 
 # Example programs and Use cases
 
+## ANTARES 2007-2017 Point source analysis
+
+### Use case description
 One of the primary goals for ANTARES is the identification of neutrino sources, whose signature would appear in ANTARES data (in form of neutrino arrival directions) as clusters of events at given sky coordinates.
-The significance of a neutrino excess from a given sky position must be assessed over the expected background fluctuations using, for instance the Feldman-Cousins statistics.
+The significance of a neutrino excess from a given sky position must be assessed against the expected background fluctuations using, for instance, Feldman-Cousins statistics.
 
-This ANTARES use case allows to inspect a sample of neutrino arrival directions in equatorial coordinates (RA, dec), evaluate from it the expected background rate for a user-selected sky position, and finally assess the significance of a cluster defined as 'all arrival directions that fall inside a given radius, selected by the user and indicated here 'region of interest' (RoI).
+This ANTARES use case allows the user to inspect a sample of neutrino arrival directions in equatorial coordinates (RA, dec), evaluate from it the expected background rate for a user-selected sky position, and finally assess the significance of a cluster, defined as all arrival directions that fall inside a given radius, selected by the user and indicated here as the 'region of interest' (RoI).
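+
+The core of this counting step is an angular-distance cut; a minimal sketch of counting events inside an RoI (the positions and radius below are made up for illustration):
+
+```python
+import numpy as np
+
+def angular_sep_deg(ra1, dec1, ra2, dec2):
+    """Great-circle separation between two equatorial positions, all in degrees."""
+    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
+    cos_sep = (np.sin(dec1) * np.sin(dec2)
+               + np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2))
+    return np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0)))
+
+# count events within 2.6 degrees of a hypothetical source at (RA, dec) = (11, -29)
+ra_evt = np.array([10.0, 12.5, 200.0])
+dec_evt = np.array([-29.5, -28.0, 40.0])
+n_obs = int((angular_sep_deg(ra_evt, dec_evt, 11.0, -29.0) < 2.6).sum())
+```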
 
-Here follow the step-by-step description of the code provided at https://git.km3net.de/srgozzini/workflow_ps/-/blob/master/PS.ipynb
-
-- The function  *bkg_evaluation(decl, roi)* permits to evaluate the expected background for a give sky declination and a given region of interest ('half width' of the declination band around the source declination). The right ascension is not used here, as the background distribution depends only on the visibility of the given sky position from the location of ANTARES, and therefore on the declination only.
-This function uses a polynomial interpolation to smoothen the background distibution otherwise obtained from the data. This function returns the number of expected background events that in the following will be indicated as nbkg.
-
-- the functions *UL(nbkg, nobs)* and *LL(nbkg, nobs)* compute the upper and lower limits in presence of nbkg backgroud events, if the measurements is nobs observed events inside the region of interest.
-
-- the function *Sens(nbkg)* is used to calculate the sensitivity, defined as average upper limit, that is reached in presence of nbkg events in the declination band of the source.
-
-- the two functions *make_vector(Decl, RA)* and  *calc_angle(v1, v2)* are auxilary functions used later. *make_vector* fills a 3-dimensional vector using the sky coordinates of an event (Decl, RA), and assigning length 1. *calc_angle* computes the angular distance (in units degrees) between two points in the sky defined as the angle between their corresponding vectors v1 and v2.
+Here follows a step-by-step description of the code provided as a [Jupyter notebook](notebooks/A01_recorded_rate)
 
-- The function *compute_limits(ra_source, dec_source, gamma, RoI)* is the main function of this script and computes the flux upper limits for a given direction in the sky defined from the coordinates ra_source and dec_source, and for the two additional source parameters gamma (spectral index) and RoI (region of interest, representing the source extension).
+### Data set
 
-## Using ANTARES data
+ANTARES data has already been published to the VO for two data sets by using the services of the German Astrophysical Virtual Observatory (GAVO), which runs the DaCHS software. The most recent public data sample of the 2007-2017 point source search is only available through the ANTARES website, however, and is thus not findable through the VO and does not match the FAIR criteria. Including ANTARES data in the development of future VO data types increases the chance of long-term availability of high-quality ANTARES data. On the other hand, the KM3NeT VO server could be registered to the VO and its protocols tested using the ANTARES 2007-2017 sample. 
 
-ANTARES data has already been published to the VO for two data sets by using the services of the German Astrophysical Virtual Observatory (GAVO) which run the DaCHS software. The most recent public data sample is available through the ANTARES website, however, it is thus not findable through VO and does not match the FAIR criteria. Including ANTARES data in the development of future VO data types allowes to increase the chance for a long-term availability of high-quality ANTARES data. On the other hand, the KM3NeT VO server could be registered to the VO and protocols be tested using the ANTARES 2007-2017 sample. 
+The provided data set includes
+* the **full event list** of 2007-2017 selected astrophysics neutrino candidates, provided through the VO server,
+* supplementary distributions from simulations, provided via the ODC, including
+  * the **detector acceptance** for a given source spectral index and declination,
+  * the interpolated **distribution of background events** for a given declination and region of interest,
+  * the **effective area** for an E^-2 source spectrum in three different zenith bands.
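+
+As a worked example of the final conversion step: dividing an event-count upper limit by the acceptance gives the flux normalization at 1 GeV, and quoting an E^-gamma spectrum at a pivot energy E0 multiplies that normalization by (E0 / 1 GeV)^-gamma. A sketch (the numbers are made up):
+
+```python
+def flux_upper_limit(n_ul, acceptance, gamma, e_pivot_gev=1.0):
+    """Convert an event-count upper limit into an E^-gamma flux upper limit.
+
+    acceptance  -- expected events per unit flux normalization at 1 GeV
+    e_pivot_gev -- pivot energy (GeV) at which the limit is quoted
+    """
+    phi_1gev = n_ul / acceptance
+    return phi_1gev * e_pivot_gev ** (-gamma)
+
+# e.g. a 10-event limit, acceptance 2, E^-2 spectrum quoted at 100 TeV (1e5 GeV)
+flux_upper_limit(10.0, 2.0, 2.0, 1e5)  # about 5e-10
+```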