[[_TOC_]]
## Space Physics Data Repository
The Space Physics Data Repository core systems were purchased with a grant from the [Roy J. Carver Charitable Trust](https://www.carvertrust.org/) and are located at the UIowa [Information Technology Facility (ITF)](https://www.facilities.uiowa.edu/building/0290).

This Space Physics Analysis Computing Environment (now, wouldn't that have been a better acronym :wink: ) supports the computing and data storage needs of all space physics researchers in the [Department of Physics and Astronomy](https://physics.uiowa.edu) and their collaborators. In particular, all operations formerly supported by systems in 708 VAN have been migrated to this facility.

Be sure to understand the [terms of service](https://space.physics.uiowa.edu/spdr/). In particular, please [acknowledge the Carver Trust](https://space.physics.uiowa.edu/spdr/carver-acknowledgement.html) in any associated publications and presentations.
The systems are co-managed by [CLAS Linux](https://clas.uiowa.edu/linux/), with direct management tasks performed by [Larry Granroth](https://space.physics.uiowa.edu/~ljg/) @ljg. (Dani @dgcrawfo will be added when she has time to deal with it.) The systems are fully integrated with the university Active Directory, so you log in with your hawkid and password. Note that accounts must first be enabled by the system administrator. Feel free to contact Larry directly with any requests or questions.
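
Once your account is enabled, a plain SSH login with your hawkid should work. As a convenience, a `~/.ssh/config` entry like the sketch below can shorten the command; the hostname shown is only an illustration of the naming pattern (check the Logins section for actual host names), and `myhawkid` stands in for your own ID.

```
# ~/.ssh/config entry — sketch only; the hostname is an assumed example
Host spdr
    HostName axanar.physics.uiowa.edu
    User myhawkid
```

With this entry in place, `ssh spdr` opens a session authenticated against your university credentials.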
The servers were initially configured with [Rocky Linux 8](https://rockylinux.org/), with software and file organization very similar to the [RPWG](https://space.physics.uiowa.edu/plasma-wave/) systems managed by Larry. Likewise, a login on any system provides access to all the same software, user, and data directories. If you need more compute resources, simply run on multiple servers.

The systems at the ITF in Oakdale were the original "SPDR" (pronounced "spider") systems, although the term now refers to all of the integrated systems, workstations and servers alike, configured to share the same login directories, data directories, and software resources. All workstations are currently running Rocky Linux 9, as is the server Harlak, where the most up-to-date versions of applications are available; migration of the remaining servers will follow.

## Servers
Capabilities include 432 CPU threads and nearly 1.8 PB of data storage, all uniformly available via any login. Following tradition, the servers are named after planets, primarily fictional planets from Star Trek. View [DCIM](https://dcim.its.uiowa.edu/cabnavigator.php?cabinetid=37) information (physical location, configuration, etc.), if authorized. [See photos](Photos).
If you have large processing jobs to run, please use the C-H machines (Cestus, Draylax, Enara, Farias, Gothos, Harlak). The A-B machines (Axanar, Binus) are primarily for routine interactive work, and Saturn hosts the core web services.
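
If your large job splits into many independent pieces, one simple pattern is to assign them to the C-H machines round-robin. The sketch below uses the short server names from the table; the `process_job` command is hypothetical, and host name resolution is assumed to follow your environment's conventions.

```shell
# Sketch: round-robin assignment of numbered jobs to the C-H compute servers.
# The server list comes from the table below; the job command is hypothetical.
pick_host() {
    n=$1                                      # job number
    set -- cestus draylax enara farias gothos harlak
    shift $(( n % 6 ))                        # rotate through the six hosts
    echo "$1"
}

# Usage sketch (process_job is a placeholder for your actual command):
#   for n in $(seq 0 99); do
#       ssh "$(pick_host "$n")" process_job "$n" &
#   done
#   wait
```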
| Server  | Status    | Threads | Speed | User Space | Data Space |
|---------|-----------|---------|-------|------------|------------|
| Axanar  | available | 48      | 4 GHz | 20 TB      | 184 TB     |
| Binus   | available | 48      | 4 GHz | 20 TB      | 184 TB     |
| Cestus  | available | 48      | 4 GHz | 3.5 TB     | 230 TB     |
| Draylax | available | 48      | 4 GHz | 3.5 TB     | 230 TB     |
| Enara   | available | 48      | 4 GHz | 3.5 TB     | 230 TB     |
| Farias  | available | 48      | 4 GHz | 3.5 TB     | 230 TB     |
| Gothos  | available | 48      | 4 GHz | 3.5 TB     | 230 TB     |
| Harlak  | available | 48      | 3 GHz | 900 GB     | 146 TB     |
| Saturn  | available | 48      | 4 GHz | 20 TB      | 126 TB     |
## Workstations
Workstations have been updated from CentOS 7 to Rocky Linux 9. The same file systems are available, as are nearly identical software environments.
## [Migration Notes](Migration-Notes)
The current page is about the SPDR environment. Follow the link above for notes on the migration from the legacy systems.
## [Logins](Graphical-Logins)
Follow the link above for information about logging in and especially for information about interactive graphical sessions.
## [Near-Real-Time System Status](https://fosstodon.org/tags/SPDRstatus)
In much the same way we used Twitter in the past to track current system issues, we now use Mastodon: https://fosstodon.org/tags/SPDRstatus. Twitter no longer allows access to content unless you are actively logged in, making it worthless for our purposes. One advantage of the Fediverse (Mastodon) system is that users other than me (@ljg) can also report status, simply by beginning a post with the string "#SPDRstatus: ". (The hashtag can actually appear anywhere in the message, but sticking to this convention makes it easier to ignore unrelated posts.) Remember that this is a public forum, so don't divulge anything that would compromise security.
## [File System Organization](File-Organization)
The linked page explains the file system organization and how it should be used, both for projects and for individual users.
## [AD Group Issues](AD-Group-Issues)
Because of the way Active Directory is configured, there will be new issues to deal with.