When attempting to boot the Windows Server 2022 installer, it crashes during boot and presents a stop code of DRIVER_IRQL_NOT_LESS_OR_EQUAL relating to storport.sys
Background
I have a set of 4 Cisco UCS C240 M3 servers that I run Hyper-V and Storage Spaces Direct on for a small hyper-converged cluster and some supporting infrastructure. Two of the servers make up the HCI cluster, while the other two serve as my VMM and backup servers. Previously, I had been running Windows Server 2019 on all of the servers without any issues. I decided to run Server 2022 through its paces and play with some of the new features. I first re-imaged the pair of HCI servers. I created a bootable USB drive (UEFI Non-CSM) with Rufus and easily installed the OS on both servers. I then moved on to the third server, my VMM server, using the same boot drive. The booting screen appeared, and before the spinning circle completed two revolutions, I got a blue screen.
Troubleshooting
Wondering if something had gone awry with the hardware, I attempted to do the same with the fourth box. Same type of hardware, same result. The only difference between the two sets of boxes is the storage controllers. The HCI boxes make use of HBA controllers, whereas the other two boxes make use of RAID controllers (LSI 9271CV-8i to be exact). I first took a look at the firmware. Both controllers were on the same firmware (23.33.1-0060). I downloaded and booted up the Cisco HUU ISO 3.0.4s to check if there was a newer version. Nope. I went ahead and re-flashed the firmware anyway to see if that would resolve the issue. Spoiler: it didn't. The next thing to try was drivers. Unfortunately, this is a bit of a trickier problem, as we can't just hit F6 and pop in a floppy.
The Fix
In order to fix this issue, I needed to inject the appropriate storage drivers into the boot.wim and install.wim on the installation media. Luckily, since I was using a USB drive to do the install, I could just swap out the wims on the flash drive. If you are using a CD, you'll need to re-create the ISO using the utility of your choice.
1. First, we need to set up a quick folder structure. I created a set of folders in C:\temp and named them Mount, Drivers, and Wims. The names themselves aren't important; their purpose is.
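If you'd rather script it, something like this does the trick (the paths are just the ones I used):

# Create the three working folders in one shot
New-Item -ItemType Directory -Path C:\temp\Mount, C:\temp\Drivers, C:\temp\Wims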
2. Next, we need to mount the Server 2022 ISO and copy the install.wim and boot.wim from the sources directory of the media to the Wims folder in our temp directory. You can do it by clicking through the GUI, or via PowerShell:
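Here's a rough PowerShell version (the ISO path is an example; note that files copied off the ISO come over read-only, so we clear that flag so the wims can be modified later):

# Mount the ISO and grab its drive letter
$iso = Mount-DiskImage -ImagePath "C:\temp\WindowsServer2022.iso" -PassThru
$letter = ($iso | Get-Volume).DriveLetter

# Copy the wims to our working folder and make them writable
Copy-Item "$($letter):\sources\boot.wim", "$($letter):\sources\install.wim" -Destination C:\temp\Wims
Get-ChildItem C:\temp\Wims\*.wim | ForEach-Object { $_.IsReadOnly = $false }

# Done with the ISO
Dismount-DiskImage -ImagePath "C:\temp\WindowsServer2022.iso"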
4. Next, we’ll need to accquire our drivers. Once you’ve got all the drivers you need, copy them to the Drivers folder like so. If you have multiple drivers, it’s a good idea to store each set in it’s own folder.
4. Now we need to identify which editions of Windows Server we're going to add our drivers to. If you know you're only going to ever deploy Standard (Desktop Experience), you can opt to add the drivers to just that index.
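Get-WindowsImage will show you which index maps to which edition, and the rest is a mount/add/dismount loop. Here's a sketch that injects everything in the Drivers folder into every index of both wims (run from an elevated prompt; boot.wim only has two indexes, Windows PE and Windows Setup):

# See which index corresponds to which edition
Get-WindowsImage -ImagePath C:\temp\Wims\install.wim

# Inject the drivers into every index of both wims
foreach ($wim in "boot.wim", "install.wim") {
    foreach ($image in Get-WindowsImage -ImagePath "C:\temp\Wims\$wim") {
        Mount-WindowsImage -ImagePath "C:\temp\Wims\$wim" -Index $image.ImageIndex -Path C:\temp\Mount
        Add-WindowsDriver -Path C:\temp\Mount -Driver C:\temp\Drivers -Recurse
        Dismount-WindowsImage -Path C:\temp\Mount -Save
    }
}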
5. At this point, all we need to do is copy the two wims back to the sources folder on our installation media. Once we do that, we should get past the BSOD.
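Assuming the flash drive came up as D:, that's just:

# Overwrite the original wims on the USB drive
Copy-Item C:\temp\Wims\*.wim -Destination D:\sources -Force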
Installation was successful, and I now have Server 2022 running on all of the servers in my lab.
Recently, I’ve been looking at adding a desk pad to my desk to protect the wood. One of my requirements was that I wanted genuine leather, not the PU crap or other synthetic materials that many sellers on Amazon try to pawn off as leather. Wading through the seemingly endless list of junk, I did find quite a few listings that claimed to be real leather. The next part was trying to find one that was the size that I needed. I wanted something at least 36in (914mm) x 13in (330mm). That narrowed down the pickings, but also upped the price with most units coming in at $80 or more. I thought to myself, I can make this cheaper. So I did… Sorta.
The Requirements
Just like any proper project, I started out forming a list of requirements. If I was going to go the custom route, it had better damn well check all of my boxes.
14in (355mm) x 40in (1016mm)
Top Grain or Full Grain Leather (Black or Brown)
Integrated PowerPlay
The Supplies
Thankfully, the supplies list for this is pretty short. I just needed some leather, some foam backing, and something to glue the two together with. For the leather, I opted for a side of chrome-tanned, black stoned oil finish leather from Tandy Leather. This ran me $150. Yes, it was more expensive than buying a desk pad off of Amazon, but I can get up to 4 desk pads out of one side and still have some left over. My side came in right at 19sqft, which makes it about $7.90/sqft. At that price, it puts the cost of my pad at $30.63. Not bad.

Next was the backing. Since I was going to be putting this on top of my PowerPlay pad, I needed something under the leather to keep things level. I didn't want a bump or any indication of where the charging pad was. The first step was to measure the thickness of the charging pad. According to my calipers, it came in at 2.3mm, so I opted to go with some 2mm-thick EVA foam. My local craft store (Michaels) had a 36″x60″ roll for $9. Close enough.

I also needed a way to adhere the two together. Looking through their glue selection, my options were limited for something that indicated it bonded leather and foam. I ended up going with Gorilla Spray Adhesive for $13, as the label indicated it could bond both materials.
The Build
I started by laying out the leather and making my cut lines. I used a long level and framing square to make sure I was cutting a proper rectangle as opposed to a rhombus or some unnamed, yet-to-be-discovered shape.
I used an X-Acto knife and put a cutting board beneath the leather while making the cuts. I cut from the top side of the leather to ensure that I had a nice clean edge (and it’s easier to mark). Next, I rolled out the foam and placed the leather on top to begin marking stuff out and ensure I had a decent margin.
Conclusion
If you consider only the amount of materials that I used to make this, the build cost comes in at about $40. Not bad at all. Timewise, it was a rather light project taking about an hour to craft, most of which was planning out cuts and such before actually cutting. You may notice that there are a few wrinkles in the above photo. These will smooth out over time, and after setting a hot cup of coffee on the left side this morning, it is pretty much completely flat. I have to say, I’m pretty happy with the result.
Power BI is a great tool to visualize data and create effective interactive dashboards and reports. Log Analytics is great for gathering and correlating data. Naturally, the two are a great pair. While there isn't yet a native connector for Log Analytics, you can still pull data by writing a custom M query in Power BI. Thankfully, with just a couple of clicks from Log Analytics, it will generate everything for you so that you don't need to know M to pull data. When doing this, you'll need to log in to Azure using an Organization Account or another interactive login method. But what if you need a report that doesn't require you to log in every time? Using an App Registration and authenticating via OAuth can accomplish this, but how do we do that in Power BI?
Security Disclaimer
The method that I'm going to show you stores the client secret in the dataset/query. This is meant as more of a PoC than something that you would use in any sort of production environment unless you can protect that data. If you're putting this report on a Power BI server, you'll want to make sure people cannot download a copy of the report file, as that would allow them to obtain the secret.
Setup
Before we try to query Log Analytics from Power BI with OAuth, we need to set up an App Registration.
1. Log into Azure and open App Registrations, either from the main portal or via Azure Active Directory. Click on the New Registration button, give it a name and a Redirect URI.
2. Next, generate a new client secret.
3. Now we need to grant access to the Log Analytics API (Data.Read). To do this, click Add Permission, select "APIs my Organization Uses", and search for Log Analytics.
Once you’ve added the permission, you need to grant admin consent to the API to interact on the users behalf.
We now have the App Registration almost ready to go. What we've done is grant the App Registration the ability to query the Log Analytics API. What we haven't done yet is grant it access to our Log Analytics workspaces. We'll take care of that next.
4. Browse to your Log Analytics workspace. We'll need to add our App Registration to the Log Analytics Reader role. This will allow the App Registration to query any table within this workspace. If you want to limit the tables that the App Registration is able to query, you will need to define a custom role and assign it to that role instead. I won't cover creating a custom role in this post, but you can read about how to create a custom role here and see a list of all the possible roles here. You may also want to read through this article to read about the schema for the JSON file that makes up each custom role.
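If you'd rather script the role assignment than click through the portal, something along these lines should do it with the Az PowerShell module (the IDs and names below are placeholders):

# Assign the App Registration the Log Analytics Reader role on the workspace
New-AzRoleAssignment -ApplicationId "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" `
    -RoleDefinitionName "Log Analytics Reader" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>"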
5. Now that the permissions have been granted, let's run a query. Once you have some results, click on Export and then Export to Power BI (M query). This will download a text file that contains the query. Go ahead and open it up.
The text file will look something like this (abridged below; the actual export also includes a comment header and generated code that converts the JSON response into a table). If we were to put this into Power BI as is, we could get our data after authenticating with our Organization Account. We'll take the output and customize it just a bit.
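// Abridged reconstruction of the export - your workspace ID and query will differ
let Source = Json.Document(Web.Contents("https://api.loganalytics.io/v1/workspaces/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/query",
[Query=[#"query"="Usage
| where TimeGenerated > ago(1d)
",#"x-ms-app"="OmsAnalyticsPBI",#"prefer"="ai.response-thinning=true"],Timeout=#duration(0,0,4,0)])),
// ...generated code that shapes the JSON response into a table...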
6. Using OAuth is a two-step approach, commonly known as Two-Legged OAuth: we first retrieve a token, and then use that token to execute our API calls. To get the token, add the following after the first line:
let ClientId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
ClientSecret = Uri.EscapeDataString("xxxxxxxxxxxxx"),
AzureTenant = "xxxxxx.onmicrosoft.com",
LogAnalyticsQuery = "Usage
| where TimeGenerated > ago(1d)
",
OAuthUrl = Text.Combine({"https://login.microsoftonline.com/",AzureTenant,"/oauth2/token?api-version=1.0"}),
Body = Text.Combine({"grant_type=client_credentials&client_id=",ClientId,"&client_secret=",ClientSecret,"&resource=https://api.loganalytics.io"}),
OAuth = Json.Document(Web.Contents(OAuthUrl, [Content=Text.ToBinary(Body)])),
This bit will set up and fetch the token. I've split out the Log Analytics query to make it easier to reuse this bit of code for additional datasets. Obviously, you'll want to put your ClientId, ClientSecret, and Azure tenant on the first three lines. After that, you'll want to edit the source line. Change it from:
let Source = Json.Document(Web.Contents("https://api.loganalytics.io/v1/workspaces/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/query",
[Query=[#"query"="Usage
| where TimeGenerated > ago(1d)
",#"x-ms-app"="OmsAnalyticsPBI",#"prefer"="ai.response-thinning=true"],Timeout=#duration(0,0,4,0)])),
You may notice that we did two things. First, we changed the query to use our variable instead of the text itself. This is done purely to make it easier to re-use the code for additional datasets. The other thing we've done is add the Authorization header to the request using the token we obtained from the OAuth line. (The line also now begins with Source rather than let Source, since the opening let comes from the block we added at the top.) You should now have something like this:
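(Reconstructed with the same placeholders as before; the tail of the generated query that shapes the response into a table is unchanged and elided.)

let ClientId = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
ClientSecret = Uri.EscapeDataString("xxxxxxxxxxxxx"),
AzureTenant = "xxxxxx.onmicrosoft.com",
LogAnalyticsQuery = "Usage
| where TimeGenerated > ago(1d)
",
OAuthUrl = Text.Combine({"https://login.microsoftonline.com/",AzureTenant,"/oauth2/token?api-version=1.0"}),
Body = Text.Combine({"grant_type=client_credentials&client_id=",ClientId,"&client_secret=",ClientSecret,"&resource=https://api.loganalytics.io"}),
OAuth = Json.Document(Web.Contents(OAuthUrl, [Content=Text.ToBinary(Body)])),
Source = Json.Document(Web.Contents("https://api.loganalytics.io/v1/workspaces/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/query",
[Query=[#"query"=LogAnalyticsQuery,#"x-ms-app"="OmsAnalyticsPBI",#"prefer"="ai.response-thinning=true"],Headers=[#"Authorization"=Text.Combine({"Bearer ",OAuth[access_token]})],Timeout=#duration(0,0,4,0)])),
// ...the rest of the generated query (the code that shapes the JSON response into a table) continues unchanged...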
Let’s put that into our query in Power BI. Open Power BI, create a new Blank Query and open the Advanced Editor.
One last thing we will need to do is set our Authentication method to Anonymous. You should be prompted to do so after clicking done, but if not, or you got click happy and dismissed it, you can click on Data Source Settings -> Edit Permissions. If you already have other data sources in your report, be sure to select the one that we just created and change the credentials to Anonymous.
We should now have a preview of our data.
Now, if we need to add additional queries, it’s as simple as duplicating the dataset and changing the LogAnalyticsQuery variable on that dataset to the new query.
Process control is one of the most important aspects of brewing. Sure, I’ve had some great beers that were brewed by people just flopping stuff into a pot. However, without process control, the results are not consistently repeatable. When we talk about process control in brewing, we are usually talking about temperature. Temperature is really everything in beer making and, all else the same, can produce very distinct beers. Temperature can affect just about every characteristic of a batch of beer such as fermentability, mouth feel, flavors, aromas, etc. In this blog post, I’m going to focus on temperature during fermentation and building a controller to do just that. I am going to focus on the hardware here and will do a follow-up post with details about the software. The software that I wrote for this is available on my GitHub space.
How it Works
The fermentation controller works by regulating the temperature during, well, fermentation. There are two outputs on the device: one for a heat source, and one for a cold source. For the cold source, I have a chest freezer that is large enough to hold four 6-gallon carboys, or a single 60L Speidel fermenter. When I need to drop the temperature, I trigger the freezer to turn on. Heat is provided by a 100W ceramic reptile bulb, which produces heat but no light, exactly what we want in a fermentation chamber. Placing everything in a chest freezer gives us a built-in cold source, blocks out light, and, since it's insulated, holds temperature (hot or cold) really well. I've been using a Tilt Hydrometer for quite some time along with either their mobile app or just a Raspberry Pi in a case that I could connect to for viewing the data. With this build though, I wanted a touchscreen to interact with it and view data, and the ability to check on the fermentation away from home.
Parts List
First thing we want to do is gather all of the components that we’ll need for our build. I’ve listed everything out here along with where I bought them and price at time of purchase. There’s a link to the spreadsheet below.
Assembly starts with putting together the Raspberry Pi and DAQC Plate. Sensors and other devices connect to the Raspberry Pi through its GPIO header, which uses 3.3V logic to communicate with connected devices (the Pi-Plates talk over the header's SPI pins). Unfortunately, the Pi does not have a built-in ADC (analog-to-digital converter). For this, we utilize the DAQC Plate by Pi-Plates to handle all of our sensors and take care of any analog-to-digital conversions we may need. Pi-Plates are stackable and have a ready-built Python library that we can use, which makes them very attractive in our case. Now, we could also do this build using an Arduino or some other IoT-type board; however, Tilt comes with a ready-built Raspbian-based image that is perfect for what we want to do and doesn't require us to write our own Bluetooth functions.
Above we have the touchscreen, the Pi, the DAQC Plate, and the Case Plate. The mounting holes on the DAQC plate do not line up with the ones on the Pi, so the Case Plate makes mounting a lot easier and helps protect the contacts and provide support for the hardware.
Now you might be wondering how I got the screen to look the way it does. That'll be covered in my next blog post, but I have uploaded the source code to GitHub if you want to rummage through it. There are a couple of things we need to do to get it to work from the Pi side, but I'll cover all of that in the next post. Here's what our data looks like: