In the near future, you will administer your IBM i LPAR by putting on your headset and entering the Metaverse…

This is just a late-night “riff” of creative writing: a very raw draft of a storyline for one possible version of the future…

===================================

It’s about 8:30 AM, and my work day begins. As a system administrator for a large insurance company, I’m responsible for keeping everything running in our various data centers. Typically I log in to my laptop and scan the various dashboards and trouble ticket systems that report when something isn’t working right. There would be an alert or log entry describing some problem. The problems could come from any of our data centers around the world. I log in to those remote systems, try to figure out what is wrong, and attempt to fix it. When needed, other co-workers assist, depending on the problem and what skill set is needed to fix it. Sometimes an application breaks, or maybe a network connection stops working. Occasionally some piece of hardware goes bad. I spend my whole day looking at log files and error messages, talking or chatting with co-workers over collaboration tools like Slack and Microsoft Teams, and emailing and calling the hardware and software vendors whose products my company uses.

That was how it was done in the past, before the Metaverse. At my company, we just call it “The Met”.

Now at 8:30 AM, I slip on my Metaverse headset. In each hand I hold a controller that functions as a combination joystick and keyboard, custom molded to the shape of my hand. Each controller is about the size of a tennis ball, and my fingers naturally rest on small buttons positioned around it. It only takes a little pressure from a fingertip to activate a button.

When I first put on my headset, my field of vision changes. The headset blocks out all light from the outside world, and instead I see what looks like a computer game, except that, unlike looking at a traditional computer monitor, it feels like I’m “in” the game. When I turn my head or look up, down, left, or right, I still see the game all around me. I also see my “game hands”. The controller balls I’m holding are electronically connected to my headset, so as I move my real hands, my game hands move in a similar way. My presence in the virtual world is my Avatar (a term borrowed from the movie of the same name).

I’m standing in what we call “The Lobby”. When you first put on and activate your headset, the initial world you see is an empty space, like being inside a large room with nothing in it. The only thing you see is an office-sized door on one wall that says “Login” in lettering like the “Exit” signs you see in a lot of office buildings.

The Lobby is a space you can use to get oriented to the virtual world before you actually enter it. Some people get a dizzy feeling when they first put on the headset; it takes a few minutes for your brain to adjust to “seeing” in the Met while not seeing the real world. If you ever watched the movie “The Matrix”, you may remember the “Construct”, a loading area the characters jacked into before actually going into the Matrix. The Lobby is something like that.

I’ve done this so many times before that I need very little adjustment time. I slightly squeeze the controller in my left hand (I’m left-handed) and move my real hand slightly forward. In the Met, I appear to move forward toward the door. By squeezing the ball and moving my hand I can move in any direction, but stay oriented as if I’m standing up and walking around. I can move fast or slow, turn left or right. I’m at the login door in just a few seconds; I reach out and turn the handle. The door opens.

I’m able to open the door because as soon as I put on the headset, the eye scanners inside it tracked the movements of each of my eyes, scanned my retinas, and determined who I was. That identity determines what I see next when I open the door. Depending on who you are and what role you have in the company, the door leaving the Lobby takes you to the place where you do your actual job.

For me, I’m now standing in what looks like a space filled with computer servers. But it is not like a square room in a standard building; I don’t see any solid walls, floor, or ceiling. It is as if you are standing in a normal building, but all the floors and walls are transparent. If I look straight ahead, I see rows of computer servers. If I look down, I see through the floor to the level below, where there are more rows of servers. If I look up, I see through the ceiling to still more servers above me. In every direction, neatly arranged “racks” of computer servers appear to float in space, and I’m floating among them.

Each of the computer servers that I see in the Met is connected to its real equivalent out in the real world. So if I’m in the Met and I turn off the power to a server, the power to that server out in the real world also goes off. Not all, but many objects in the Met are connected, or “mirrored”. If I’m interacting with a mirrored computer in the Met, then I’m doing the same thing to the real server it is mirrored to. For example, I might walk up to a goldfish tank in the Metaverse and sprinkle some food into it. If that fish tank is mirrored, then out in the real world, an automated food dispenser on the real fish tank will feed the fish in it.

In the field of view of my headset, there is a “Heads-Up Display” (HUD). It looks like a traditional laptop display, but it is semi-transparent and appears to “float” just in front of me, at about the same distance a normal computer screen would be from your face. On the HUD are things similar to what you might see on a traditional computer screen, but more 3D, since I’m in the Met. In the second Matrix movie, there is a scene where the Nebuchadnezzar returns to Zion. The operators of the giant doors into Zion sit in white chairs, dressed in all white, in what appears to be a white computer room, looking at transparent computer screens as they control the doors. The HUD that I see is similar to that.

On the HUD there is something that looks like a list with some text. I stare at it, and if I blink just my left eye (I’m also left-eye dominant), the eye trackers in my headset detect that as a command. The list visually expands so I can easily read it. This is the list of outstanding problem/trouble tickets that fit my areas of expertise. There are 10 entries, sorted by priority.

The first one says:

“DC7, Switch 2232, Network Performance Slowdown.” Priority 1

The second one:

“DC1, Server 42, LPAR 17, IBM i, Scheduled Job did not complete.” Priority 1

The third one:

“DC2, Server 135, LPAR 332, AIX, RootVG disk space low.” Priority 2

DC stands for Data Center. My company has 16 data centers in different places around the world. I’m physically in Phoenix, Arizona, where I live. In the Met, when I first went through the login door from the Lobby, I entered DC1, our largest data center, located in Arizona, where most of our support problems occur. DC7, in the first entry on my HUD, is in Hong Kong.

I’m about to do something that we call “Transport”, a term borrowed from Star Trek. I need to take my “virtual self” from DC1, where I am now, to DC7 in Hong Kong so I can investigate the first problem on my list.

I stare at the first item on the list in my HUD and blink my left eye. My field of view fades away, similar to how people on Star Trek “fade away” when they transport. It takes about one second; my field of view “fades back in”, and now I’m at DC7. In the one second it took me to transport from DC1 to DC7, several things happened. Most of that fade-out, fade-in time is just so that I don’t get physically disoriented from switching between spaces within the Met. You can adjust how long your transport takes to complete; some people require more time, some less. In a fraction of that second, the system also verified that I’m allowed to access DC7 and determined what level of access I have. Things that I don’t have authorization for, I don’t even see in my view of DC7. Also, the real data center staff in Hong Kong are notified that I’ve virtually entered, since I might need a physical person in the Hong Kong data center in order to fix the problem.

In my company’s Metaverse, we have about 10,000 “spaces”. These are locations like virtual buildings, data centers, single office rooms, outdoor areas, gardens, and meeting halls. Spaces are in different locations around the world. You can transport to any space you are authorized to access, or you can just walk around in a space the way you would walk around in any building. Other people in the Met who are also walking around see your Avatar, and you see theirs. It is like the experience you get in Roblox, but the resolution is much higher and everything looks semi-realistic.

Your Avatar can be customized into anything, but since I’m in my company’s Met, my Avatar has to follow the company’s guidelines for acceptable appearance. When I’m working in my company’s Met, I use my corporate Avatar, which resembles my real person (with a few improvements). Depending on which Metaverse you are in, you can have a different Avatar for that specific Met.

DC7 looks slightly different from DC1. In DC1, everything is tinted blue in some way. In DC7, everything has a slight red tint.

I’m on the main floor of DC7. I still see my HUD with my task list for the day. I reach up with my left Metaverse hand and touch the first entry on the list. My Avatar floats up three levels and moves down one of the rows at about three times normal walking speed. This “speed walk” is also designed not to disorient the real person. The Met is constructed so it somewhat resembles the physical world it is connected to, but not exactly. In the Met, you have “superpowers”: you can walk very fast, transport, jump great distances, and do other things like that.

I’m now standing in front of a large cabinet of computer servers called a “rack”. In the rack are several servers. I reach out and open the rack door to access the problem server on my list. This action is logged in the security system; my access to this rack is checked, and within milliseconds of my touching the rack handle, the door opens. If I did not have access to this rack, an alarm would sound inside the Met, and out in the real world as well. My Avatar would be frozen where I stand in DC7, and my manager would be notified.

All of the computer servers in DC7 are mirrored to real servers. Just about every type of computer server or network component these days has a “Metaverse API Module”. The API is an interface that connects the virtual computer to the real one. If I’m in the Met and make a change to the virtual computer server, the same change is made to the real computer.
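
To give a flavor of what such an API module might look like to a programmer, here is a purely imaginative sketch in Python. Everything in it (the class, the method, the endpoint, the URLs) is invented for this story; no such library or service exists today.

```python
# Purely imaginative sketch of a "Metaverse API Module" client.
# Every name and endpoint here is invented for this story.
import requests


class MirroredDevice:
    """A virtual object in the Met that forwards actions to its real twin."""

    def __init__(self, device_id: str, api_base: str, token: str):
        self.device_id = device_id
        self.api_base = api_base
        self.headers = {"Authorization": f"Bearer {token}"}

    def set_power(self, on: bool) -> None:
        # Acting on the virtual device sends the same command to the
        # real device it is mirrored to.
        requests.post(
            f"{self.api_base}/devices/{self.device_id}/power",
            json={"state": "on" if on else "off"},
            headers=self.headers,
            timeout=5,
        )


# Flipping the virtual power switch in the Met might translate into:
switch = MirroredDevice("DC7-2232", "https://met.example.com/api", "TOKEN")
switch.set_power(False)
```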

Now that the rack door is open, I see the network switch labeled 2232. It is the third one down. The devices in the rack are stacked on top of each other, each about two inches tall. If I stare at the 2232 device and left-blink, a new HUD panel appears in my field of view. This panel lists all the vital statistics of the device, highlights any errors from the logs, and shows the history of all changes made to it.

The story will continue…