Metadata-Version: 2.1
Name: accession
Version: 2.1.0
Summary: Tool to submit genomics pipeline outputs to the ENCODE Portal
Home-page: https://github.com/ENCODE-DCC/accession
Author: Paul Sud
Author-email: encode-help@lists.stanford.edu
License: MIT
Project-URL: Documentation, https://accession.readthedocs.io/en/latest/
Project-URL: Source Code, https://github.com/ENCODE-DCC/accession
Project-URL: Issue Tracker, https://github.com/ENCODE-DCC/accession/issues
Platform: UNKNOWN
Classifier: License :: OSI Approved :: MIT License
Classifier: Natural Language :: English
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Requires-Python: >=3.6
Description-Content-Type: text/x-rst
Requires-Dist: requests
Requires-Dist: encode-utils (>=2.9.0)
Requires-Dist: google-cloud-storage
Requires-Dist: attrs
Requires-Dist: boto3 (==1.13.5)
Requires-Dist: caper (==1.0.0)
Requires-Dist: google-cloud-tasks (==1.5.0)
Requires-Dist: google-auth (==1.18.0)
Requires-Dist: google-api-core (==1.21.0)
Requires-Dist: typing-extensions (==3.7.4.2)
Provides-Extra: dev
Requires-Dist: sphinx ; extra == 'dev'
Requires-Dist: pytest ; extra == 'dev'
Requires-Dist: pytest-cov ; extra == 'dev'
Requires-Dist: pytest-mock ; extra == 'dev'
Requires-Dist: docker ; extra == 'dev'
Requires-Dist: pre-commit ; extra == 'dev'
Provides-Extra: docs
Requires-Dist: sphinx ; extra == 'docs'
Provides-Extra: tests
Requires-Dist: pytest ; extra == 'tests'
Requires-Dist: pytest-cov ; extra == 'tests'
Requires-Dist: pytest-mock ; extra == 'tests'
Requires-Dist: docker ; extra == 'tests'

==============
accession
==============

.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
    :target: https://github.com/ambv/black
    :alt: Code Style: Black

.. image:: https://img.shields.io/badge/License-MIT-blue.svg
   :target: https://lbesson.mit-license.org/
   :alt: License: MIT

.. image:: https://circleci.com/gh/ENCODE-DCC/accession.svg?style=svg
    :target: https://circleci.com/gh/ENCODE-DCC/accession
    :alt: CircleCI status

.. short-intro-begin

``accession`` is a Python module and command line tool for submitting genomics pipeline analysis output files and metadata to the ENCODE Portal.

.. _installation:

Installation
=============

Note: installation requires Python >= 3.6

.. code-block:: console

    $ pip install accession

Next, provide your API keys from the ENCODE portal:

.. code-block:: console

    $ export DCC_API_KEY=XXXXXXXX
    $ export DCC_SECRET_KEY=yyyyyyyyyyy

You will also need to authenticate with Google Cloud if using WDL metadata from pipeline runs on Google Cloud. Run the following two commands and follow the prompts:

.. code-block:: console

    $ gcloud auth login --no-launch-browser
    $ gcloud auth application-default login --no-launch-browser

| In addition, it is highly recommended to set the ``DCC_LAB`` and ``DCC_AWARD``
  environment variables for ease of use. These correspond to the lab and award
  identifiers given by the ENCODE portal, e.g. ``/labs/foo/`` and ``U00HG123456``,
  respectively.

.. code-block:: console

    $ export DCC_LAB=XXXXXXXX
    $ export DCC_AWARD=yyyyyyyyyyy

| If you would like to be able to pass Caper workflow IDs or labels, you will
  need to configure access to the Caper server. If you are invoking ``accession``
  from a machine where you already have Caper set up, and the Caper configuration
  file is available at ``~/.caper/default.conf``, then no extra setup is required.
  If the Caper server is on another machine, you will need to configure HTTP
  access to it by setting the ``hostname`` and ``port`` values in the Caper conf
  file.
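
A minimal sketch of such a configuration is shown below; the hostname and port are placeholder values, so substitute the address of your own Caper server.

.. code-block:: ini

    # ~/.caper/default.conf -- hypothetical values shown
    hostname=caper-server.example.com
    port=8000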

| (Optional) Finally, to enable using Cloud Tasks to upload files from Google
  Cloud Storage to AWS S3, set the following two environment variables. If either
  of them is not set, then files will be uploaded from the same machine that runs
  the accessioning code. For more information on how to set up Cloud Tasks and
  the upload service, see the docs for the
  `gcs-s3-transfer-service <https://github.com/ENCODE-DCC/gcs-s3-transfer-service/>`_.

.. code-block:: console

    $ export ACCESSION_CLOUD_TASKS_QUEUE_NAME=my-queue
    $ export ACCESSION_CLOUD_TASKS_QUEUE_REGION=us-west1

Usage
======

.. code-block:: console

    $ accession -m metadata.json \
                -p mirna \
                -s dev

Please see the `docs <https://accession.readthedocs.io/en/latest/#detailed-argument-description>`_ for greater detail on these input parameters.

.. short-intro-end

Project Information
====================

``accession`` is released under the `MIT <https://choosealicense.com/licenses/mit/>`_ license, documentation lives in `readthedocs <https://accession.readthedocs.io/en/latest/>`_, code is hosted on `github <https://github.com/ENCODE-DCC/accession>`_ and the releases on `PyPI <https://pypi.org/project/accession/>`_.


