
Time to Process

If you're looking for extra computing power, you've come to the right place.

HORUS is the high-performance computing fast lane for Michigan universities and community colleges. Researchers and students can tap into computing power typically only available to the state's top research universities.

Funded by the National Science Foundation, HORUS (Helping Our Researchers Upgrade their Science) is a collaboration of the University of Michigan, Michigan State University, Wayne State University, and Merit Network. It builds on the work done under the OSiRIS project, also funded by the National Science Foundation, to provide large-scale scientific data storage. It takes advantage of the high-speed research network created for OSiRIS, as well as Merit's state-of-the-art fiber-optic network.

By combining OSiRIS storage with HORUS processing power, more of Michigan's scientific minds can realize the potential of their ideas. HORUS is free to use for members of public higher-education institutions, participants in NSF-funded science projects, and others involved in research and education. If you are interested in using HORUS but unsure whether you qualify, please contact us at horus-help@umich.edu and introduce yourself and your proposed work.

Users from participating institutions can log in using their existing credentials associated with their home institutions. HORUS is also available to institutions around the country under the NSF grant's sharing mandate, through which up to 20% of HORUS resources are accessible via OSG/PATh.

HORUS provides three distinct types of computational nodes, co-located with existing OSiRIS storage infrastructure and connected to an existing 100 Gbps research network, and leverages a set of open-source software to make the resource broadly available, including to researchers outside the region via the OSG/PATh sharing described above.

HORUS aims to:

  • Provide computing power to under-resourced researchers
  • Accelerate scientific discovery at universities, colleges, and community colleges in Michigan and surrounding regions
  • Give easy access to the diverse computing and storage resources needed for scientific research and analysis
  • Remove barriers to the open science grid and serve as an on-ramp to national resources
  • Introduce students to high-performance computing, data science, and artificial intelligence
  • Support collaboration among community colleges with certificate programs in data science
  • Encourage collaboration between research universities and the broader research community
  • Promote new course development

We are currently seeking pilot users for project HORUS. Request more information HERE.

Educators

Students, postdocs, teachers: Access to HORUS resources accelerates scientific discovery and boosts data science programs.

Administrators

HORUS is a gateway to the open science grid. There is no charge to use HORUS, which is funded by the National Science Foundation. Contact the project leadership team if your institution is interested in collaborating.

Community Colleges

HORUS opens up a world of powerful resources for science students. For educators and administrators, HORUS can support data science certificate programs and collaborations with other institutions. New users welcome. For a large class or more complex needs, contact the project team.

Sample use cases

  • Materials simulation applications that compute over hundreds of 1-2 TB files
  • Inter-institution research that calls for greater computational and data resources
  • Genomics data analysis requiring a mix of central VAI computation and computation by individual researchers
  • Centralized compute resource and data storage for single particle cryo-electron microscopy
  • Evolutionary genomics research that requires large-scale simulations and analyses of empirical data
  • Processing network telescope data for 1) longitudinal studies, 2) performing annotations with external, third-party data sources (e.g., Censys.io, historical DNS data, etc.), and 3) employing ML/AI/statistical techniques for data clustering, predictive analytics, inference, anomaly detection, and other applications (a minimal anomaly-detection sketch follows this list)
  • Community college data science certificate programs that require access to easy-to-use compute, storage, and software resources to provide a broad introduction to the exploding field of data science
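As one illustration of the network telescope use case above, the sketch below shows a simple anomaly-detection pass over summary data. It is illustrative only: the input file, feature columns, and model settings are placeholders and are not part of any specific HORUS workflow.

```python
#!/usr/bin/env python3
"""Illustrative sketch: flag unusual sources in network telescope summaries.

Assumptions (hypothetical, for illustration): a CSV of per-source daily
summaries exists with the feature columns used below.
"""
import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical per-source summary features derived from telescope captures.
df = pd.read_csv("telescope_daily_summary.csv")   # placeholder input file
features = df[["packets", "unique_dest_ports", "unique_dest_ips", "avg_pkt_len"]]

# Flag sources whose behaviour deviates from the bulk of the data.
model = IsolationForest(n_estimators=200, contamination=0.01, random_state=0)
df["anomaly"] = model.fit_predict(features)       # -1 marks outliers

print(df[df["anomaly"] == -1].head())
```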

Dec 10, 2025 - HORUS status and plans

The HORUS project team is undertaking a range of renovations and upgrades to hosts, services, and tooling to improve reliability, security, and performance for current users. Many of these changes are ongoing and are intended to reduce incidents and simplify operations in the coming months.

One exciting new capability we expect to make available to registered users in about one month is a dedicated LLM (large language model) focused on HORUS and OSiRIS. This tool will help users with questions about how to use resources, debug common problems, and optimize workflows. It is intended as a quick first-response aid that points users toward likely fixes and useful documentation, not as a replacement for our human support staff. For complex or unresolved issues, please continue to open support tickets so the team can follow up.

We are continuing the migration of the remaining hosts to AlmaLinux 9. Examples of systems being updated include our perfSONAR nodes, the project Wiki, the LDAP server, the ELK (Elasticsearch/Logstash/Kibana) system, and the CoManage deployment. These upgrades are focused on improving long-term supportability and security posture.

In the slightly longer timeframe we plan to upgrade Puppet from 7.34 to Puppet 8.7+; this upgrade will be coordinated and tested to avoid disruption to production services. We will share more details and maintenance windows as the upgrade schedule is finalized.

We also continue routine maintenance and updates of the OSiRIS Ceph storage deployment, which currently provides just under 12 petabytes of raw capacity. Ongoing work ensures the cluster remains healthy and performant for user workloads.

All of these efforts are intended to help enable a possible extension of HORUS and OSiRIS service availability beyond August 2026. At the same time, we need to be clear that the current plan still anticipates a shutdown of services on August 31, 2026. We will communicate any changes to those plans as they are confirmed.

If you have questions about these changes, need assistance, or would like early access to the LLM when it becomes available, please contact the HORUS support team or open a support ticket.

Sep 11, 2025 - OSiRIS and HORUS Service timeline and migration notice

Summary: OSiRIS and HORUS will stop operations at the end of August 2026. Users with data or workloads on these platforms should plan and complete migrations well before that date.

Background: OSiRIS (a National Science Foundation DIBBs project) provided software-defined storage for research in the Michigan region from September 2015 through August 2021 (with a one-year no-cost extension). HORUS (a CC* Regional Computing project) added compute resources to support OSiRIS users and the broader research community, operating from September 2022 through August 2025 (with a one-year no-cost extension). Since their formal funding periods ended, both projects have continued in a best-effort mode by their host institutions.

Important timeline

  • Operations for both OSiRIS and HORUS are scheduled to end on 31 August 2026.
  • After that date, data access and compute services will no longer be available.

What users should do now

  • Review any data you currently store on OSiRIS. If you need to retain it, plan and begin migrating it to alternative storage well before August 2026 (see the copy-out sketch after this list).
  • For HORUS users, migrate code, configurations, and any persistent data to another compute platform prior to the end-of-service date.
  • If you do not need your data, please remove it or contact the team for assistance with deletion to avoid unnecessary storage of unused data.
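For researchers whose OSiRIS allocation is reachable through an S3-compatible interface, a small script can mirror a bucket to local or other storage. The sketch below is a minimal, hedged example: the endpoint URL, bucket name, credential environment variables, and destination directory are placeholders, and the access method available for your allocation should be confirmed with the support team before relying on this approach.

```python
#!/usr/bin/env python3
"""Minimal sketch: copy objects out of an S3-compatible OSiRIS bucket.

Assumptions (placeholders, not confirmed values): ENDPOINT_URL, BUCKET,
DEST_DIR, and the credential environment variables below.
"""
import os
import boto3

ENDPOINT_URL = "https://your-osiris-s3-endpoint.example.org"  # placeholder
BUCKET = "my-research-bucket"                                  # placeholder
DEST_DIR = "osiris-export"

s3 = boto3.client(
    "s3",
    endpoint_url=ENDPOINT_URL,
    aws_access_key_id=os.environ["OSIRIS_ACCESS_KEY"],
    aws_secret_access_key=os.environ["OSIRIS_SECRET_KEY"],
)

# Walk every object in the bucket and mirror it to the destination directory.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        local_path = os.path.join(DEST_DIR, key)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        s3.download_file(BUCKET, key, local_path)
        print(f"copied {key} ({obj['Size']} bytes)")
```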

Resources

  • OSiRIS: https://www.osris.org/
  • HORUS: https://horus-ci.org/

Contact: For questions or assistance with migration, email horus-help@umich.edu

Thank you, Shawn McKee for the OSiRIS and HORUS projects

Feb 12, 2025 - HORUS / OSiRIS update — new lg nodes, Ceph upgrade, and service reminder

Just two quick announcements.

1) We have added 7 new nodes to the HORUS SLURM cluster. These are large-memory, large-CPU hosts (1.5 TB of RAM, 384 logical CPUs, 2x100 Gbps network) and are part of a new SLURM partition named 'lg'.

2) We are updating the OSiRIS Ceph from 18.2.4 to 19.2.1.
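For users who want to try the new partition, the sketch below shows one way to submit a job to it from Python. It is illustrative only: the partition name 'lg' comes from this announcement, but the job name, resource requests, and the my_analysis command are hypothetical, and actual limits should be checked against the cluster's configuration.

```python
#!/usr/bin/env python3
"""Minimal sketch: submit a job to the new 'lg' SLURM partition.

Assumptions (not confirmed by the announcement): sbatch is available where
this runs, and the requested resources fit the 'lg' partition limits.
"""
import subprocess
import textwrap

# A simple batch script targeting the large-memory 'lg' partition.
job_script = textwrap.dedent("""\
    #!/bin/bash
    #SBATCH --job-name=lg-demo
    #SBATCH --partition=lg
    #SBATCH --cpus-per-task=32       # nodes expose up to 384 logical CPUs
    #SBATCH --mem=200G               # nodes have 1.5 TB RAM; request what you need
    #SBATCH --time=02:00:00

    srun my_analysis --input data.h5   # hypothetical application
""")

# Feed the script to sbatch on stdin and print the assigned job ID.
result = subprocess.run(
    ["sbatch"], input=job_script, capture_output=True, text=True, check=True
)
print(result.stdout.strip())   # e.g. "Submitted batch job 12345"
```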

A reminder: we are running the infrastructure on a best-effort basis until August 2026. For users who have a lot of data in OSiRIS, it may be worthwhile to start planning where it will go after August 2026.

Please consider deleting data you no longer need or use.

If you have any questions or concerns, you can contact us at horus-help ‘at’ umich.edu.

Shawn McKee for the HORUS and OSiRIS projects

Oct 6, 2023 - HORUS Fall 2023 Webinar

The HORUS webinar for Fall 2023 will take place on October 25, 2023 at 9 AM Eastern.

Please see the details at the Merit Webinar Registration site.

Jul 25, 2023 - OSiRIS Updates for HORUS

As the basis for the HORUS storage infrastructure, we rely upon OSiRIS for a reliable, resilient platform providing large-scale, high-performance storage. OSiRIS currently provides over 1400 disks and 12 PB of raw storage space. However, OSiRIS was originally deployed on a base operating system of CentOS 7, which is reaching End-of-Life (EOL) in June 2024. Also, some of the tools, libraries, and components were in need of upgrades to address security and functionality issues. One of our main upgrades beyond the operating system was PHP, which needed to be brought up to version 8.2. Many of the upgrades introduced compatibility issues, and we needed to work closely with various software providers to maintain functionality while fixing security issues. This work has taken most of the HORUS team’s effort since the equipment was deployed and also required “best effort” help from the remaining OSiRIS team members.

However, we are happy to report success by the end of July, and the team is now focused on enabling a smooth onboarding process for our early adopters.