
Summary of Storage Services

University of Delaware’s Research Office Core Facilities

We propose registering all HPC and storage resources on the existing Research Office Core Facilities page: https://research.udel.edu/core-facilities/.

EFS (Engineering File System)

This service is available to all departments, research groups, and faculty members within the College of Engineering, and to their collaborators (inside and outside UD), for research-related data. It is not intended for individual data backups.

As a faculty member or researcher, you understand the importance of storing and backing up your crucial data. Engineering File Services (EFS) is an expandable data storage and backup solution with built-in disaster recovery capabilities. EFS provides 1 TB of free (forever) storage per faculty member or research group; additional storage can be purchased in 1 TB increments for a nominal one-time fee of $500. What makes EFS different from Google Drive, Dropbox, or other cloud-based storage? With EFS, your data remains on campus, within our high-speed network, so EFS can handle large datasets that would be impractical for a third-party cloud storage solution: access your data over 2.5x faster on EFS than from any cloud provider.

EFS provides two types of storage: SAS hard disk (platter) drives and SAS solid-state drives.

EFS has enabled a new compression feature that lowers actual storage usage.

  • Total Users to date: 488
  • Total Storage: 130.89 TB
  • Allocated Physical Capacity: 115.32 TB
  • Unallocated Physical Capacity: 15.55 TB
  • Actual Compressed Space Used (of allocated): 64.158 TB
  • Allocated but Available Space: 54.8 TB
  • Total Space Allocated across All Shares: 168 TB
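
As a quick sanity check, the figures above fit together as a thin-provisioned, compressed storage pool. The sketch below (plain Python, values copied from the list; the small rounding drift against the 130.89 TB total is in the source figures) shows the relationships:

```python
# EFS capacity figures (TB), copied from the summary above.
allocated = 115.32          # physical capacity allocated to shares
unallocated = 15.55         # physical capacity not yet allocated
compressed_used = 64.158    # actual compressed space consumed
share_quotas = 168.0        # space allocated across all shares

# Allocated + unallocated should roughly equal the reported 130.89 TB total
# (the source figures carry a small rounding drift).
print(round(allocated + unallocated, 2))                   # 130.87

# Share quotas exceed physical capacity: thin provisioning counts on
# compression and under-use to keep real consumption within the hardware.
print(f"overcommit: {share_quotas / allocated:.2f}x")      # 1.46x

# Fraction of allocated physical capacity actually consumed.
print(f"utilization: {compressed_used / allocated:.1%}")   # 55.6%
```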

Globus

As part of an NSF Cyberinfrastructure grant, UD IT purchased a Data Transfer Node (DTN) with 10 TiB of scratch space to facilitate moving data between UD and non-UD systems (e.g., XSEDE clusters) in support of researchers. The DTN will be available to all UD PIs once its implementation is completed and tested.

HPC Cluster (Farber & Caviness)

The University of Delaware currently has two high-performance computing (HPC) community clusters, Farber and Caviness; the Mills community cluster was retired from computing on July 1, 2019. These clusters were built on the recommendation of a 2011 UD Research Computing Task Force, which suggested the University create a large, broadly available HPC cluster. All Colleges participate except Education & Human Development. Storage is provided for home, workgroup, and scratch (temporary) use. This is the only storage dedicated to research; it is funded by research-group buy-in and is not generally available to everyone at UD.

Combined across Farber + Caviness (as of 2019-07-03)

  • Total Users: 1023
  • Total Storage available on ZFS+NFS home and workgroup: 388 TB
  • Total Storage used on ZFS+NFS home and workgroup: 144 TB
  • Total Storage available on Lustre scratch: 469 TB
  • Total Storage used on Lustre scratch: 155 TB

OneDrive

Everyone at UD gets a OneDrive account; general availability began January 16, 2018. Usage rules and description are the same as for SharePoint.

  • Total Users in past 30 days: 691
  • Total Storage Available per user: 5 TB
  • Total Storage Available for 691 users over the past 30 days = 5 TB * 691 = 3455 TB
  • Total Storage Used: 866 GB
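
The ceiling figure above is simply the per-user quota multiplied by the 30-day active-user count; actual usage is a tiny fraction of it. A quick check (decimal units, matching the list):

```python
# OneDrive figures from the list above.
active_users = 691
quota_tb_per_user = 5

# Theoretical ceiling: quota times active users.
ceiling_tb = active_users * quota_tb_per_user
print(ceiling_tb)                     # 3455

# Actual usage (866 GB) as a share of that ceiling.
used_tb = 866 / 1000                  # 866 GB in decimal TB
print(f"{used_tb / ceiling_tb:.4%} of the theoretical ceiling used")
```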

OURRstore

OURRstore (the OU & Regional Research Store) is a long-term tape archival system funded by an NSF MRI grant to the University of Oklahoma (OU); it is expected to be available to UD researchers in late 2019. As of 9/3/2019, OU is waiting on vendor responses to its RFP. OURRstore will be a large-scale (many PB), long-term (8+ year), multi-copy tape, multi-institution archive intended to allow hundreds of students, staff, and faculty to:

  • Build large and growing data collections
  • Share and publish datasets, making them discoverable and searchable

Participation will include EPSCoR states and territories (and others) nationwide.

Non-constraints

  • NO REQUIREMENT that there be a relevant funded project (regardless of funding agency).

Constraints

  • NO legally regulated data (HIPAA, CUI, ITAR/EAR, etc).
  • NO CLINICAL DATA (but STEM research data is fine)
  • Minimum file size: 1 GB
  • Maximum file size: 10% of the raw capacity of the tape cartridge (so, for LTO-7 “Type M,” 10% of 9 TB = 900 GB for maximum file size)
  • Recommended file size: 20 – 200 GB
  • Must store dual or triple copies (secondary copies will be shipped back to the file owner)
  • NEVER EVER BACKED UP: The tape cartridges you buy are the ONLY copies we have of your files. (We do back up the file catalog and other metadata, but each user/PI is responsible for providing tape cartridges for their own secondary copies of their own data, which we’ll ship back to them using a prepaid shipping label that they provide.)
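
The maximum-file-size rule above is simple arithmetic on the cartridge capacity. A quick check for LTO-7 “Type M” (decimal units, 1 TB = 1000 GB):

```python
# LTO-7 "Type M" raw cartridge capacity and the 10% rule from the list above.
cartridge_capacity_gb = 9 * 1000      # 9 TB in decimal GB
max_file_gb = 0.10 * cartridge_capacity_gb
print(max_file_gb)                    # 900.0

# The recommended 20-200 GB range sits well inside the 1 GB minimum
# and the 900 GB maximum.
assert 1 <= 20 and 200 <= max_file_gb
```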

University Data Backup Server

  • Total Storage Available: 110 TB
  • Total Storage Used: 110 TB

Storage is recycled daily after backups are moved from disk storage to tape storage.

Infrastructure as a Service/IaaS (Private Cloud Infrastructure on VMware)

Central IT virtualization service provides departments with more flexibility than buying a physical server. There are no up-front costs, and departments are not locked into one configuration; the virtual server configuration, and associated costs, can be changed as a department’s needs increase or decrease. Virtual servers can be configured with various CPU, memory, and storage quantities, and the choice of one of several operating systems.

Virtual machine storage is serviced by a pair of Network Appliance All Flash FAS 8040 systems configured as a production system in one data center with data mirrored to a disaster recovery system in another data center.

This is a core central service available to the entire University community. UD IT does not manage all the virtual machines which take advantage of this service, and as such, a count of users is not available.

  • Total Users: not available
  • Total Storage Available: 85,166 GiB = 83.17 TiB
  • Total Storage Used: 40,595 GiB = 39.64 TiB
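
The GiB figures above convert to TiB with the binary factor 1 TiB = 1024 GiB. A one-line helper (a sketch for clarity, not part of any UD tooling) makes the conversion explicit:

```python
def gib_to_tib(gib: float) -> float:
    """Convert gibibytes to tebibytes (1 TiB = 1024 GiB)."""
    return gib / 1024

print(f"{gib_to_tib(85_166):.2f} TiB")   # 83.17 TiB
```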