New Class, Here I Come: ASP.Net

Back to expanding my knowledge with a new class at DeVry, this time developing an ASP.Net website with C#. I'm looking forward to this one, as it will directly help me with an ASP.Net site that I took over as admin last year. Fortunately, I still have a few developers who are familiar with the code; but I can't say it has been easy trying to figure it out when my only experience was setting up basic websites with plain HTML. Granted, the site at work uses Visual Basic, but the underlying idea is the same: a main page file that defines the page, and an underlying code-behind file that handles the more advanced programming logic.

The first few chapters I had to read helped immensely, as I now understand how ASPX files actually work with their code-behind files. That is honestly what has been throwing me off this whole time with the website at work. I would look at the top-level ASPX file and keep wondering how the hell it actually worked. Now I know. Granted, I should have been able to figure this out on my own since it isn't that difficult; I just never had the time at work while I still had two developers and other tasks that needed to be completed.

Back to my class, though. I got through the first week without any issues. The class is set up around a final project that we progressively build throughout the course each week via the lab assignments. The first lab was basically to get familiar with Web Forms by creating two: one that just displayed simple text, and another with a very basic salary calculator. Since that wasn't too bad, I spent the rest of my time setting up an Azure DevOps project to track the course project. I've got each week set up as a Sprint on my Scrum board, and I have the code in the project repository so I can control versioning. I struggled a bit trying to set up a lab branch tied to my Task, so I held off on branching for now. I'm going to give a dev/LabWork branch another go this next week so I can better familiarize myself with Git branching.

This should be fun and I am legitimately looking forward to this.

Unity Mega Bundle Deal

Well, this year isn’t starting out quite like I was hoping. Between dealing with a family emergency and COVID-19, I haven’t had much time to really focus on anything. Thankfully, work is not a current concern as I can work remote.

However, I'm still trying to keep my goal of learning Unity in mind through all of this. Apparently, the tech gods decided to throw me a bit of a bone to keep me going. I spotted what looked to me like a fairly good deal that Unity shared via a LinkedIn post: they are offering a $1,000 Mega Bundle for 90% off. It includes a one-year subscription to their Unity Learn Premium service ($99 by itself) and a load of assets to use in Unity.

Figured what the hell and picked it up. Now to make sure I actually do something with it. The offer is good until the end of March, in case anyone else is interested.

Grow Your Skills Mega Bundle

SQL: “String or binary data would be truncated” Error

While supporting a number of MS SQL servers I took over as DBA this year, I found that a scheduled job handling a nightly data feed had started failing. After reverse engineering exactly how the job worked, I traced it to a specific stored procedure failing with the “String or binary data would be truncated” error.

What does this mean? Basically, the process is running an INSERT query that puts data into a column that is not big enough to hold it. For example, say I have a column defined as nvarchar(3) and I try to insert ‘testing’ into it. That insert will generate this error, since ‘testing’ is 7 characters and the column is limited to 3.
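
Here's a minimal sketch of what's happening (the table and column names are made up for illustration):

-- destination column only holds 3 characters
create table #Demo (ShortCol nvarchar(3));

-- fails with: “String or binary data would be truncated”
insert into #Demo (ShortCol) values (N'testing');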

Solution! I have two options: ignore the error and let SQL truncate the data, or find the offending data and increase the size of the destination column.

Adding “SET ANSI_WARNINGS OFF” to the query will get me around the error; but be warned, it silently truncates the data and is not recommended.
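
Continuing the sketch from above (again, placeholder names), this is what that looks like:

set ansi_warnings off;

-- now succeeds, but silently stores just 'tes'
insert into #Demo (ShortCol) values (N'testing');

set ansi_warnings on;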

That leaves increasing the size of the destination column. However, when my source data has over 100k rows and my destination table has over 35 columns, that is easier said than done. I could just increase the size of every column in the destination table, but that would be overkill in my case. I know there are likely more elaborate ways of finding the offending data, but I was short on time and needed to find it quickly. While this query is very simple, it worked perfectly for tracking down the offending data:

-- replace COLUMN and TABLE with the column and table being checked;
-- the longest value sorts to the top
select top 1 [COLUMN]
from [TABLE]
order by len([COLUMN]) desc
I just changed “COLUMN” to each column I needed to check until I had found all of the offending data. Very simple and easy to tweak as needed. I found plenty of complex solutions out there, but sometimes all it takes is a simple one.
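
For what it's worth, pairing those lengths with the column definitions makes it easy to see which destination columns actually need widening. A quick sketch, assuming the destination table is dbo.Destination (a made-up name):

-- defined size of each column in the destination table
select COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
from INFORMATION_SCHEMA.COLUMNS
where TABLE_SCHEMA = 'dbo'
and TABLE_NAME = 'Destination'
order by ORDINAL_POSITION;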

Azure Update

Okay, so I started an Azure account, created a Web app project tied to Azure DevOps, and I just finished deleting the project. While I admit I haven't had the time to really dedicate to it, the decision came down to cost. I should have done better research. I have the free credit, but at the rate I was burning through it, it wouldn't have lasted long.

Going to count my losses and regroup once I better understand the cost behind each service. I still have a fair amount of my credit left, so it isn't so bad. I just need to make sure I have time to delve deeper. For now, AWS.