Data storage and memory#

As you produce more notebooks and environments in Anaconda Notebooks, you may begin to run out of storage and find that processes slow down. This topic explains how data storage and memory usage work in Anaconda Notebooks and provides instructions to resolve these issues.

Anaconda Notebooks provides varying levels of cloud storage and CPU “high-compute” seconds based on your Anaconda tier level.

What is a high-compute second?

A CPU second is one second of running code on a single CPU core at 100% utilization. We call them “high-compute seconds” on our pricing page to clearly distinguish CPU seconds from real-world seconds. Simply running JupyterLab, writing code, and using the interface consumes very little of your quota. Only running Python code from within a notebook and running commands from the terminal count against your quota, and even then few commands truly tax the CPU.

For example, if your code makes an HTTP request, it uses a tiny amount of CPU time assembling the request and sending it out over the network, but then uses no CPU at all while waiting for a response. When the response comes back, it again uses a small amount of CPU to interpret the response and provide your code with the results. In general, CPU time accrues only while your program is actively computing, not while it is waiting on other systems.
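You can observe this distinction directly in a notebook cell. The sketch below is a generic illustration, not Anaconda-specific: a sleep stands in for the network wait of a hypothetical HTTP request, and Python's standard library clocks show that wall-clock time and CPU time diverge.

```python
import time

# A sleep stands in for the network wait of a hypothetical HTTP request.
wall_start = time.perf_counter()  # real-world ("wall clock") seconds
cpu_start = time.process_time()   # CPU seconds this process consumes

time.sleep(1.0)                          # waiting: wall time passes, CPU time does not
_ = sum(i * i for i in range(100_000))   # a brief burst of actual computation

wall_elapsed = time.perf_counter() - wall_start
cpu_elapsed = time.process_time() - cpu_start
print(f"wall time: {wall_elapsed:.2f}s, CPU time: {cpu_elapsed:.2f}s")
```

The CPU time reported stays far below the wall time, because the second spent sleeping costs no compute quota.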

When does the clock on CPU seconds reset?

Our notebook service accounts have a per-day limit on the number of seconds the CPU can be fully utilized. Once an instance hits that limit, it is not shut down; instead, it is given lower CPU priority and a cap on available compute resources. The limit resets every day, so full compute access is restored the next day.

[Table: per-tier allotments of fast, backed-up SSD storage and daily CPU seconds]
Data storage#

You can monitor your available cloud storage space with the Disk Usage meter at the top of the screen. If you’re maxing out your storage space, the most likely culprits are custom environments, all of which appear in the Disk Manager. To free up storage, delete these environments (optionally downloading them first for later use) or upgrade your subscription.
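If you prefer to investigate from a notebook cell rather than the Disk Usage meter, a short standard-library script can show where the space is going. This is a generic sketch, not an Anaconda API; it simply sums file sizes under each top-level directory in your home folder.

```python
import os
from pathlib import Path

def dir_size_bytes(root: str) -> int:
    """Total size in bytes of regular files under root (symlinks and
    unreadable entries are skipped)."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root, onerror=lambda e: None):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if not os.path.islink(path):
                    total += os.path.getsize(path)
            except OSError:
                pass  # file vanished or is unreadable; ignore it
    return total

# Report each top-level directory in your home folder, largest first.
home = Path.home()
sizes = {d.name: dir_size_bytes(str(d)) for d in home.iterdir() if d.is_dir()}
for name, size in sorted(sizes.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{size / 1e6:10.1f} MB  {name}")
```

Large entries here usually correspond to custom environments or cached packages, which you can then remove through the Disk Manager.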

What kind of storage does Anaconda Notebooks use?

Anaconda Notebooks uses persistent Elastic Block Store (EBS) storage: fast, backed-up SSD storage that supports common data science and machine learning workloads. EBS storage is generally faster and more reliable than most other cloud-hosted options.

Managing disk usage#

You can view and manage all your files by clicking anywhere on the Disk Usage meter to open the Disk Manager:

1 - Clear Cache

Anaconda recommends clearing your cache regularly to free up space.

2 - Reset…

Restores all selected items to their default state. Select Download items before deleting to ensure you don’t lose valuable work.

3 - Download

Downloads selected files. Consider downloading valuable files before deleting.

4 - Delete

Deletes selected files.

5 - File Name

Hovering over the File Name column header exposes a filter tool to further sort through your files. Clicking File Name orders (and reverses the order of) files alphabetically.

6 - Size

Clicking Size orders (and reverses the order of) files by size.

Removing custom environments#


Creating custom environments consumes a large amount of storage. Anaconda recommends that Free tier users avoid building complex environments and keep no more than one custom environment at a time. Upgrade your subscription to work with more custom environments.

  1. In a terminal within Anaconda Notebooks, run conda env list and check whether any environments are listed outside /opt/conda.

  2. If there are, you can remove those unwanted environments in the Disk Manager as shown in the previous section, or by running:

    # Replace <ENV_NAME> with the environment name
    conda env remove -n <ENV_NAME>
  3. Further, clear out the cache and other artifacts by clicking Clear Cache in the Disk Manager as shown in the previous section, or by running:

    conda clean --all
    pip cache purge
    rm -rf /tmp/*
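The check in step 1 can also be scripted from a notebook. The sketch below parses the JSON emitted by conda env list --json (a real conda flag) and flags any environment prefix outside /opt/conda; it assumes conda is on your PATH and skips the lookup if it is not.

```python
import json
import shutil
import subprocess

def custom_envs(env_json: dict, system_prefix: str = "/opt/conda") -> list:
    """Return environment prefixes that live outside the system install."""
    return [p for p in env_json.get("envs", []) if not p.startswith(system_prefix)]

# Assumes `conda` is on your PATH; the lookup is skipped if it isn't.
if shutil.which("conda"):
    result = subprocess.run(["conda", "env", "list", "--json"],
                            capture_output=True, text=True, check=True)
    for prefix in custom_envs(json.loads(result.stdout)):
        print("custom environment:", prefix)
```

Any prefix this prints is a candidate for removal with conda env remove -n <ENV_NAME> or through the Disk Manager.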


CPU and memory usage#

If your notebook is running slowly, you may have exceeded your CPU usage limit for the day. You can still work when this happens, but performance is reduced (e.g., loading a .csv file with pandas may take 10 seconds instead of half a second). The limit resets daily.

Anaconda Notebooks limits processes to 6GB of RAM per kernel. Exceeding this limit terminates your process, at which point you will need to restart your kernel. If you need to run larger processes, please contact us.
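Because exceeding the 6GB limit kills the kernel without warning, it can help to estimate a large object's footprint before allocating it. The sketch below is simple arithmetic, not an Anaconda API; the 6GB figure comes from the per-kernel limit described above, and real memory use will be somewhat higher due to overhead.

```python
RAM_LIMIT_BYTES = 6 * 1024**3  # the 6GB per-kernel limit described above

def fits_in_ram(n_elements: int, itemsize_bytes: int = 8,
                limit: int = RAM_LIMIT_BYTES) -> bool:
    """Rough check: would an array of n_elements (e.g. float64 values,
    8 bytes each) fit under the per-kernel RAM limit? Ignores object
    overhead, so treat a True result near the boundary with suspicion."""
    return n_elements * itemsize_bytes < limit

print(fits_in_ram(100_000_000))    # 100M float64 values, about 0.8GB -> True
print(fits_in_ram(1_000_000_000))  # 1B float64 values, 8GB -> False
```

Running a check like this before loading a very large dataset can save you an unplanned kernel restart.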

To see current progress towards your daily quota, refer to the CPU Usage widget at the top of Anaconda Notebooks.

To better manage your CPU usage, regularly check the Running Terminals and Kernels tab in the left sidebar and shut down any kernels you no longer need.