Tag: github

  • Migrating to GitHub Packages

    I have been running a free version of ProGet locally for years now. It served as a home for NuGet packages, Docker images, and Helm charts for my home lab projects. But, in an effort to slim down the apps running in my home lab, I took a look at some alternatives.

    Where can I put my stuff?

    When I logged in to my ProGet instance and looked around, it occurred to me that I only had 3 types of feeds: NuGet packages, Docker images, and Helm charts. So to move off of ProGet, I needed to find replacements for all of these.

    Helm Charts

    Back in the heady days of using Octopus Deploy for my home lab, I used published Helm charts to deploy my applications. However, since I switched to a GitOps workflow with ArgoCD, I haven’t published a Helm chart in a few years. I deleted that feed in ProGet. One down, two to go.

    NuGet Packages

    I have made a few different attempts to create NuGet packages for public consumption. A number of years ago, I tried publishing a data layer that was designed to be used across platforms (think APIs and mobile applications), but even I stopped using that in favor of Entity Framework Core and good old-fashioned data models. More recently, I created some “platform” libraries to encapsulate some of the common code that I use in my APIs and other projects. They serve as utility libraries as well as a reference architecture for my professional work.

    There are a number of options for hosting NuGet feeds, with costs that vary depending on how the feed is structured. I considered the following options:

    • Azure DevOps Artifacts
    • GitHub Packages
    • NuGet.org

    I use Azure DevOps for my builds, and briefly considered using the Artifacts feeds. However, none of my libraries are private. Everything I am writing lives in a public repository on GitHub. With that in mind, it seemed that the free offerings from GitHub and NuGet.org were more appropriate.

    I published the data layer packages to NuGet.org previously, so I have some experience with that. However, while these platform libraries are public, I do not expect them to be heavily used. For that reason, I decided that publishing the packages to GitHub Packages made a little more sense. If these platform libraries get to the point where they are heavily used, I can always publish stable packages to NuGet.org.
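
    For reference, pushing to a GitHub Packages NuGet feed looks roughly like this (a sketch, assuming a personal access token with the write:packages scope; OWNER and the package file name are placeholders):

      # Register the GitHub Packages feed as a NuGet source (OWNER is a placeholder)
      dotnet nuget add source "https://nuget.pkg.github.com/OWNER/index.json" \
        --name github --username OWNER --password "$GITHUB_PAT" --store-password-in-clear-text

      # Push the package to the GitHub feed
      dotnet nuget push "bin/Release/MyLibrary.1.0.0.nupkg" --source github --api-key "$GITHUB_PAT"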

    Container Images

    Percentage-wise, container images take up the bulk of my ProGet storage. Now, I only have 5 container images, but I never clean anything up, so those 5 images are taking up about 7 GB of data. When I was investigating alternatives, I wanted to make sure I had some way to clean up old pre-release tags and manifests to keep my usage down.

    I considered two alternatives:

    • Azure Container Registry
    • GitHub Container Registry

    An Azure Container Registry instance would cost me about $5 a month and provide me with 10 GB of storage. GitHub Container Registry provides 500 MB of storage and 1 GB of data transfer per month, but those limits only apply to private repositories.

    As with my NuGet packages, nothing that I have is private, and GitHub Packages is free for public packages. Additionally, I found a GitHub Action that will clean up old images. As this was one of my “new” requirements, I decided to take a run at GitHub Packages.
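
    The post doesn’t name the specific action, but as one real example of the idea, GitHub’s actions/delete-package-versions can prune old container versions on a schedule. A minimal sketch, with the package name as a placeholder:

      # Scheduled workflow to prune old container image versions (sketch)
      name: cleanup-packages
      on:
        schedule:
          - cron: '0 3 * * 0'   # weekly, Sunday at 03:00
      jobs:
        cleanup:
          runs-on: ubuntu-latest
          permissions:
            packages: write     # required to delete package versions
          steps:
            - uses: actions/delete-package-versions@v5
              with:
                package-name: 'my-api'      # placeholder image name
                package-type: 'container'
                min-versions-to-keep: 5     # always retain the newest five versions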

    Making the switch

    With my current setup, the switch was fairly simple. NuGet publishing is controlled by my Azure DevOps service connections, so I created a new service connection for my GitHub feed. The biggest change was some housekeeping to add appropriate information to the NuGet package itself, including adding the RepositoryUrl property to the .csproj files. This property tells GitHub which repository to associate the package with.
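
    For illustration, the relevant .csproj addition might look like this (the package id and repository URL are stand-ins, not my actual projects):

      <!-- Package metadata in the .csproj; RepositoryUrl ties the package to its GitHub repository -->
      <PropertyGroup>
        <PackageId>MyCompany.Platform.Core</PackageId>  <!-- placeholder package id -->
        <RepositoryUrl>https://github.com/OWNER/platform-core</RepositoryUrl>
      </PropertyGroup>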

    The container registry switch wasn’t much different: again, the work was mostly housekeeping, this time adding the appropriate labels to the images. From there, a few template changes and the images were in the GitHub Container Registry.
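
    The label GitHub uses to associate a container image with a repository is org.opencontainers.image.source; a sketch of the Dockerfile line (the repository name is a placeholder):

      # Associate this image with its GitHub repository so GHCR links the two
      LABEL org.opencontainers.image.source="https://github.com/OWNER/my-api"

    From there, tagging the image as ghcr.io/OWNER/my-api and pushing it lands it in the GitHub Container Registry.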

    Overall, the changes were pretty minimal. I have a few projects left to convert, and once that is done, I can decommission my ProGet instance.

    Next on the chopping block…

    I am in the beginning stages of evaluating Azure Key Vault as a replacement for my HashiCorp Vault instance. Although it comes at a cost, my usage will most likely run under $3 a month, and getting away from self-hosted secrets management would make me a whole lot happier.

  • Git Out! Migrating to GitHub

    Git is Git. Wherever it’s hosted, the basics are the same. But the features and community around the tooling have driven me to make a change.

    Starting Out

    My first interactions with Git happened around 2010, when we decided to move away from Visual SourceSafe and Subversion and onto Git. At the time, some of the cloud services were either in their infancy or priced outside of what our small business could absorb. So we stood up a small Git server to act as our centralized repository.

    The beauty of Git is that, well, everyone has a copy of the repository locally, so it’s a little easier to manage the backup and disaster recovery aspects of a centralized Git server; the central server is pretty much a glorified file share.

    To the Cloud!

    Our acquisition opened up access to some new tools, including Bitbucket Cloud. We quickly moved our repositories to Bitbucket Cloud so that we could decommission our self-hosted server.

    Personally, I started storing my projects in Bitbucket Cloud. Sure, I had a GitHub account. But I wasn’t ready for everything to be public, and Bitbucket Cloud offered unlimited private repos. At the time, I believe GitHub was charging for private repositories.

    I also try to keep my home setup as close to work as possible in most cases. Why? Well, if I am working on a proof of concept that involves specific tools and their interaction with one another, it’s nice to have a sandbox that I can control. My home lab ecosystem has evolved based on the ecosystem at my job:

    • Self-hosted Git / TeamCity
    • Bitbucket Cloud / TeamCity
    • Bitbucket Cloud / Azure DevOps
    • Bitbucket Cloud / Azure DevOps / ArgoCD

    To the Hub!

    Even before I changed jobs, a move to GitHub was in the cards, both personally and professionally.

    Personally, I cannot think of a more popular platform or community than GitHub for sharing and finding open/public code. My GitHub profile is, in a lot of ways, a portfolio of my work and contributions. As I have started to invest more time into open source projects, my portfolio has grown. Even some of my “throw away” projects are worth a little, if only as a reference for what to do and what not to do.

    Professionally, GitHub has made a great many strides in its Enterprise offering. Microsoft’s acquisition has only pushed things along, giving GitHub access to some of the CI/CD pipeline solutions that Azure DevOps has, coupled with GitHub’s ease of use. One of the projects on the horizon at my old company was to identify whether GitHub and GitHub Actions could be the standard for build and deploy moving forward.

    With my move, we have a mixed ecosystem: GitHub + Azure DevOps Pipelines. I would like to think that, long term, I could get to GitHub + GitHub Actions (at least at home), but the interoperability of Azure DevOps Pipelines with Azure itself makes it hard to migrate completely. So, with a new professional ecosystem in front of me, I decided it was time to drop Bitbucket Cloud and move to GitHub for everything.

    Organize and Move

    Moving the repos is, well, simple. Using GitHub’s Import functionality, I pointed it at my old repositories, entered my Bitbucket Cloud username and personal access token, and GitHub imported them.

    This simplicity meant I had time to think about organization. At this point, I am using GitHub for two pretty specific types of projects:

    • Storage for repositories, either public or private, that I use for my own portfolio or personal projects.
    • Storage for repositories, all public, that I have published as true Open Source projects.

    I wanted to separate the projects into different organizations, since the hope is that the true Open Source projects will see contributions from others in the future. So before I started moving everything, I created a new GitHub organization. As I moved repositories from Bitbucket Cloud, I put them in either my personal GitHub space or this new organization space, based on the classification above. I also created a new SonarCloud organization to link to the new GitHub organization.

    All Moved In!

    It really only took about an hour to move all of my repositories and re-configure any automation that I had to point to GitHub. I set up new scans in the new SonarCloud organization, re-pointed the actions correctly, and everything seems to be working just fine.

    With all that done, I deleted my Bitbucket Cloud workspaces. Sure, I’m still using Jira Cloud and Confluence Cloud, but I am at least down a cloud service. Additionally, since all of the projects that I am scanning with Sonar are public, I moved them to SonarCloud and deleted my personal instance of SonarQube. One less application running in the home lab.

  • SonarCloud has become my Frank’s Red Hot…

    … I put that $h!t on everything!

    A lot has been made in recent weeks about open source and its effects on all that we do in software. And while we all debate the ethics of HashiCorp’s decision to turn to a “more closed” licensing model and question the subsequent fork of their open source code, we should remember that there are companies that offer their cloud solutions free for open source projects.

    But first, GitHub

    GitHub has long been the mecca for open source developers, and even under Microsoft’s umbrella, that does not look to be slowing down. Things like CI/CD through GitHub Actions and package storage are free for public repositories. So, without paying a dime, you can store your open source code, get automatic security and version updates, build your code, and store build artifacts, all in GitHub. All of this is built on the back of a great ecosystem for pull request reviews and checks. For my open source projects, it provides great visibility into my code and puts MOST of what I want in one place.

    And then SonarQube/Cloud

    SonarSource’s SonarQube offering is a great way to get static code analysis on your code. While their Community Edition lacks the features that require an enterprise license, their cloud offering provides free analysis of open source projects.

    With that in mind, I have started to add my open source projects to SonarCloud.io. Why? Well, first, it does give me some insight into where my code could be better, which keeps me honest. Second, on the off chance that anyone wants to contribute to my projects, the Sonar analysis will help me quickly determine the quality of the incoming code before I accept the PR.

    Configuring the SonarCloud integration with GitHub even provides a SonarCloud bot that reports on the quality gate for pull requests. What does that mean? It means I get a great picture of the quality of the incoming code before it is merged.
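
    The post doesn’t show its exact setup, but for a sense of how small the wiring can be, a minimal SonarCloud scan via GitHub Actions might look like this sketch (it assumes the project and organization keys live in a sonar-project.properties file):

      # Minimal SonarCloud scan on pull requests (sketch)
      name: sonarcloud
      on:
        pull_request:
      jobs:
        scan:
          runs-on: ubuntu-latest
          steps:
            - uses: actions/checkout@v4
              with:
                fetch-depth: 0    # full history gives Sonar better new-code data
            - uses: SonarSource/sonarcloud-github-action@master
              env:
                GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
                SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}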

    What Next?

    I have been spending a great deal of time on the Static Code Analysis side of the house, and I have been reasonably impressed with SonarQube. I have a few more public projects to onboard to SonarCloud, but at work, it is more about identifying the value that can come from this type of scanning.

    So, what is that value, you may ask? Enhancing and automating your quality gates is always beneficial, as it streamlines your developer workflow. It also sets expectations: engineers know that bad/smelly code will be caught well before a pull request is merged.

    If NOTHING else, SonarQube allows you to track your test coverage and ensure it does not trend backwards. If we did nothing else, we should at least ensure that we continue to cover the code we write now, even if those before us did not.

  • Talk to Me Goose

    I’ve gone and done it: I signed up for a trial of GitHub Copilot. Why? I had two driving needs.

    In my work as an architect, I do not really write a TON of code. When I do, it is typically for proofs of concept or models for others to follow. With that in mind, I am not always worried about the quality of the code: I am just looking to get something running so that others can polish it and make it better. So, if Copilot can accelerate my delivery of these POCs and models, it would be great.

    At home, I tinker when I can with various things. Whether I am contributing to open source projects or writing some APIs to help me at home, having a little AI companion might be helpful.

    My one-month experiment

    GitHub Copilot offers a free thirty-day trial, so I signed up. Now, unfortunately, because I do not have a GitHub Enterprise account, I do not have access to Copilot for Business. Since that edition has privacy guarantees that Copilot for Individuals does not, I kept Copilot on my home machine.

    In spite of this, I did sufficient work in the 30 days to get a pretty good idea of what Copilot has to offer. And I will say, I was quite impressed.

    IntelliSense on Steroids

    With its integration into VS Code and Visual Studio, Copilot really beefs up IntelliSense. Where normal IntelliSense will complete a variable name or function call, Copilot will start to suggest code based on the context in which I am typing. Start typing a function name, and Copilot will suggest the body of the function, using the code around it as reference. Natural language comments are my favorite: add a comment like “bubble sort a generic list,” and Copilot will generate code to match.
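
    To illustrate the pattern (this is my own sketch of the kind of completion you see, not captured Copilot output), the comment-driven flow looks something like this:

      using System;
      using System.Collections.Generic;

      public static class SortSamples
      {
          // bubble sort a generic list   <-- the natural-language prompt
          public static void BubbleSort<T>(List<T> list) where T : IComparable<T>
          {
              // Repeatedly swap adjacent out-of-order elements until the list is sorted
              for (int i = 0; i < list.Count - 1; i++)
              {
                  for (int j = 0; j < list.Count - i - 1; j++)
                  {
                      if (list[j].CompareTo(list[j + 1]) > 0)
                      {
                          (list[j], list[j + 1]) = (list[j + 1], list[j]);
                      }
                  }
              }
          }
      }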

    Head to Head!

    As I could not install Copilot on my work machine, I am essentially running a head-to-head comparison of “Copilot vs No Copilot.” In this type of comparison, I typically look for “help without intrusion,” meaning that the tool makes things faster without me knowing it is there. By that standard, Copilot passes with flying colors. On my home machine, it definitely feels as though I am able to generate code faster, yet I am not constantly “going to the tool” to get that done. The integration with Visual Studio and VS Code is very good.

    That said, the only officially supported IDEs are Visual Studio, VS Code, Vim/Neovim, and the JetBrains IDEs, with that last one still in beta. I anticipate more support as the tool matures, but if you are using one of those IDEs heavily, I highly recommend giving Copilot a shot. Everyone needs a Goose.

  • Badges… We don’t need no stinkin’ badges!

    Well… Maybe we do. This is a quick plug (no reimbursement of any kind) for the folks over at Shields.io, who make creating custom badges for readme files and websites an easy and fun task.

    A Quick Demo

    [License badge for spyder007/MMM-PrometheusAlerts]
    [Build status badge for spyder007/MMM-PrometheusAlerts]

    The badges above are generated by Shields.io. The first one looks like this:

    https://img.shields.io/github/license/spyder007/MMM-PrometheusAlerts

    My GitHub username (spyder007) and the repository name (MMM-PrometheusAlerts) are used in the image URL, which generates the badge. The second one, build status, looks like this:

    https://img.shields.io/github/actions/workflow/status/spyder007/MMM-PrometheusAlerts/node.js.yml

    In this case, my GitHub username and the repository name remain the same, but node.js.yml is the name of the workflow file for which I want to display the status.
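
    To actually display a badge, the URL just becomes the source of a markdown image in the Readme, for example:

      ![Build Status](https://img.shields.io/github/actions/workflow/status/spyder007/MMM-PrometheusAlerts/node.js.yml)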

    Every badge in Shields.io has a “builder” page that explains how to build the image URL, lets you override styles, colors, and labels, and even lets you add logos from any icon in the Simple Icons collection.

    Some examples of alterations to my build status above:

    [“For the Badge” style, Bugatti logo with custom color]
    [Flat style, CircleCI logo, custom label]
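
    The customization all happens in the query string. Hypothetical reconstructions of those two badges might look like the following (style, logo, color, and label are real Shields.io parameters; the values here are my guesses):

      https://img.shields.io/github/actions/workflow/status/spyder007/MMM-PrometheusAlerts/node.js.yml?style=for-the-badge&logo=bugatti&color=blueviolet
      https://img.shields.io/github/actions/workflow/status/spyder007/MMM-PrometheusAlerts/node.js.yml?style=flat&logo=circleci&label=CI%20Build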

    Too many options to list…

    Now, these are live badges, meaning that if my build fails, the green “passing” will change to a red “failing.” Shields.io does this by using the variety of APIs available to gather data about builds, code coverage, licenses, chat, sizes and download counts, funding, issue tracking… It’s a lot. But the beauty of it is, you can create Readme files or websites with easy-to-read visuals. My PI Monitoring repository’s Readme makes use of a number of these shields to give you a quick look at the status of the repo.