Taskmatics Scheduler 1.1 Released

We’ve released version 1.1 today, which contains many performance improvements as well as some new features requested by the community. If you download the installer today you’ll get the new version, and we’ve also updated the Taskmatics Scheduler package on the NuGet repository. The primary focus of this build was to improve system-wide performance. In installations with a significant number of tasks running at high frequencies, we had seen some slowness; with these changes we’ve measured improvements of 80% to 90% in load times. This isn’t limited to the administration UI screens, but applies to the reports as well.

How to Get It

To get the installer, go here!

What’s Inside

The full list of changes in this release is as follows:

  • Added the ability to set exclusion dates on the calendar trigger, i.e. dates on which the scheduled task will not run.
  • Fixed an intermittent trigger loading error when adding a trigger to an existing scheduled task.
  • Added performance improvements to all reports.
  • Added performance improvements to dashboard loading.
  • Added performance improvements to task dispatching.
  • Added performance improvements to task history search.
  • Database schema changes to help improve overall system performance.
  • Fix for intermittent out of memory exception when querying resource usage.
  • Added filter to the Dashboard screen to allow users to find scheduled tasks that contain the entered term (similar to filter on other listing screens).
  • Added additional error handling and update messaging to the installer process.

Breaking Changes

While it’s always a goal to be fully backwards compatible, this release does contain one breaking change that may require a rebuild of one or more tasks. In the past, each task instance was identified with a Globally Unique Identifier, or GUID. These identifiers are great at ensuring that billions of tasks won’t collide with each other, but the downside of that uniqueness guarantee is that they don’t perform as well as sequential integers. As part of the new release, we’ve re-mapped all task instances to use 64-bit integers, which keeps each task instance unique within the database but performs much better in searches.

You should update all of your task libraries to reference the latest Taskmatics.Scheduler.Core NuGet package. If any of your tasks reference TaskBase.Context.TaskInstanceId, you will need to rebuild and redeploy those tasks, as the TaskBase.Context.TaskInstanceId property is now an Int64 instead of a Guid.
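In practice, the change usually amounts to updating the declared type wherever a task reads its instance identifier. A minimal sketch (the logging call is illustrative; only the property's type changed):

```csharp
// Before 1.1, TaskInstanceId was a Guid:
// Guid instanceId = Context.TaskInstanceId;

// From 1.1 on, it is a 64-bit integer:
long instanceId = Context.TaskInstanceId;
Context.Logger.Log("Running task instance {0}", instanceId);
```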

Upgrade Instructions

As of now, we don’t have an upgradeable installer, which means that to upgrade an installation of Taskmatics Scheduler, you should first uninstall the previous version, then run the new installer and point it to the existing database that Taskmatics Scheduler was originally configured to use. Note: Uninstalling Taskmatics Scheduler will NEVER remove the data in your database. We suggest following these steps to install the new version on your system:

  1. Stop all coordinator and agent Windows services (done from the Services snap-in).
  2. Backup your existing database. This is precautionary in case anything unexpected happens during the upgrade.
  3. Take note of the following information from your current installation:
    • Your serial number, which is in the email you received when you purchased Taskmatics Scheduler. You can also find it in the license.xml file in the ProgramData\Taskmatics\Scheduler directory.
    • Your current database server, database name and the runtime credentials. These can be found in the ProgramData\Taskmatics\Scheduler\Taskmatics.Scheduler.Coordination.ServiceHost.exe.config file.
    • The runtime users for both the coordinator and agent Windows services, which you can see from the services snap-in.
    • The root filesets path and the root working folders path, which are found in the .config files located in the ProgramData\Taskmatics\Scheduler directory.
  4. Uninstall the existing Taskmatics Scheduler components (done from the Programs and Features screen in Windows).
  5. Run the new Taskmatics.Scheduler installer. You will need to re-enter the information you collected in step 3 to make sure that all the original permissions and mappings are re-used.

Due to the nature of the performance improvements and the significant number of database changes in this update, installation of the new version could take upwards of 30 minutes, depending on the size of your database (for example, if you have millions of task instances). Making the upgrade path smoother for our users is at the top of our roadmap, so we don’t expect this to be the norm going forward.

More to Come

This is just the beginning of the changes that we have in store. Thanks again for using Taskmatics Scheduler, and we hope you enjoy working with the new version even more. Let us know what features you’d like to see in subsequent versions of Taskmatics Scheduler as we’re always looking for ideas from the community.

Where was that download link again?

In case you missed it at the top, to get the installer, go here!

How to Host ASP.NET in a Windows Service

Today, I’ll be showing how you can finally host ASP.NET in a Windows service. Yes, with ASP.NET 5, it is possible to host ASP.NET in a Windows service. This article builds on a previous one which shows you how to run a DNX (.NET Execution Environment) application in a Windows Service. It’s a quick one, so go read it now.

Project Dependencies

Once you’ve got the shell project set up using the previous article, you’ll need to add in some dependencies.

Open up project.json and add in a dependencies property with the following entries:

  • "Microsoft.AspNet.Hosting": "1.0.0-beta7" – Bootstraps the web server
  • "Microsoft.AspNet.Server.Kestrel": "1.0.0-beta7" – Web server implementation
  • "Microsoft.AspNet.StaticFiles": "1.0.0-beta7" – Hosts static files
  • "Microsoft.AspNet.Mvc": "6.0.0-beta7" – Includes all the MVC packages

Your project.json should now look like this:

    {
        "version": "1.0.0-*",
        "description": "MyDnxService Console Application",
        "commands": {
            "run": "MyDnxService"
        },
        "frameworks": {
            "dnx451": {
                "frameworkAssemblies": {
                    "System.ServiceProcess": ""
                }
            }
        },
        "dependencies": {
            "Microsoft.AspNet.Hosting": "1.0.0-beta7",
            "Microsoft.AspNet.Server.Kestrel": "1.0.0-beta7",
            "Microsoft.AspNet.StaticFiles": "1.0.0-beta7",
            "Microsoft.AspNet.Mvc": "6.0.0-beta7"
        }
    }

Run in Visual Studio or Command Line

It would really be a pain to develop and debug your application while running as a Windows service. That’s not to say you can’t, but if you are currently developing an application, you probably want to run it from Visual Studio or the command line. To do this, you need some sort of switch in the Main method that will either call ServiceBase.Run or just call directly into OnStart and OnStop. Let’s do this by writing our Main method as follows:

public void Main(string[] args)
{
    if (args.Contains("--windows-service"))
    {
        // Installed as a Windows service: hand control to the SCM.
        ServiceBase.Run(this);
        return;
    }

    // Running from Visual Studio or the command line: drive the
    // service lifecycle manually.
    OnStart(null);
    Console.WriteLine("Press any key to stop the server...");
    Console.ReadKey();
    OnStop();
}


Simply check for the presence of the --windows-service argument, call ServiceBase.Run and you are good to go. Now you can just run and debug your application from Visual Studio by hitting F5. When you want to install the application as a Windows service, don’t forget to pass --windows-service at the end of the binPath= argument.

sc.exe create <service-name> binPath= "\"<dnx-exe-path>\" \"<project-path>\" run --windows-service"

Configure and Start the Server and ASP.NET

Let’s add some namespaces:

using Microsoft.AspNet.Builder;
using Microsoft.AspNet.Hosting;
using Microsoft.AspNet.Hosting.Internal;
using Microsoft.Framework.Configuration;
using Microsoft.Framework.Configuration.Memory;
using Microsoft.Framework.DependencyInjection;
using System;
using System.Diagnostics;
using System.Linq;
using System.ServiceProcess;

Before we start the server, we need to set up a few fields and a constructor on the Program class. We need to store the hosting engine, its shutdown function and a service provider that we will use soon. The IServiceProvider instance will be injected by the DNX runtime.

private readonly IServiceProvider _serviceProvider;
private IHostingEngine _hostingEngine;
private IDisposable _shutdownServerDisposable;

public Program(IServiceProvider serviceProvider)
    _serviceProvider = serviceProvider;

To get the server up and running, we will use the WebHostBuilder class. Your Program.OnStart method should look as follows:

protected override void OnStart(string[] args)
{
    var configSource = new MemoryConfigurationSource();
    configSource.Add("server.urls", "http://localhost:5000");

    var config = new ConfigurationBuilder(configSource).Build();
    var builder = new WebHostBuilder(_serviceProvider, config);
    builder.UseServer("Microsoft.AspNet.Server.Kestrel");
    builder.UseServices(services => services.AddMvc());
    builder.UseStartup(appBuilder =>
    {
        appBuilder.UseDefaultFiles();
        appBuilder.UseStaticFiles();
        appBuilder.UseMvc();
    });

    _hostingEngine = builder.Build();
    _shutdownServerDisposable = _hostingEngine.Start();
}

There are several steps involved here:

  1. Create and populate the server configuration.
  2. Create the builder and tell it which server implementation to use.
  3. Configure services using the built-in dependency injection support.
  4. Configure the ASP.NET middleware pipeline.
  5. Build and start the server.

The previous code is oversimplified for the purposes of this article. We are hardcoding the server URL into an in-memory config store, but you can set this up in other ways. See the ASP.NET Configuration repo on GitHub for other options.
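As one alternative, the in-memory source could be swapped for a JSON file. A minimal sketch, assuming a config.json next to the project and the beta7-era JSON configuration package; the file name and key are assumptions, not part of the original code:

```csharp
using Microsoft.Framework.Configuration;

// Hypothetical alternative: read server.urls from config.json
// (e.g. { "server.urls": "http://localhost:5000" }) instead of
// populating a MemoryConfigurationSource in code.
var config = new ConfigurationBuilder()
    .AddJsonFile("config.json")
    .Build();
```

The rest of OnStart stays the same; only the source of the configuration changes.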

To gracefully shut down the server, implement Program.OnStop as follows:

protected override void OnStop()
{
    if (_shutdownServerDisposable != null)
        _shutdownServerDisposable.Dispose();
}

Getting Down to Business

Now that we have a server set up and running ASP.NET with static files and MVC, let’s add some content and a controller.

First, create an index.html file and add “Hello world” as the content. Then create a controller file TimeController.cs with the file contents as follows:

using Microsoft.AspNet.Mvc;
using System;

namespace MyDnxService
{
    public class TimeController
    {
        [HttpGet("/time")]
        public DateTime GetTime()
        {
            return DateTime.Now;
        }
    }
}

That’s all there is to it. Now we can pull up http://localhost:5000 to see “Hello world” and http://localhost:5000/time to see the current time.

The Runtime User and Going to Production

When you install a Windows service, you have to specify a user under which to run the process. If you don’t, “Local System” is the default. Why is this important?

In the previous article we simply ran our Windows service as the default (Local System) user. It turns out we got lucky since we didn’t reference any package dependencies. If we try to do the same thing in this case, the service will quickly fail to start because “Local System” won’t be able to resolve any of the package dependencies we just added.

As you add package references to the project.json file, Visual Studio quietly runs dnu restore in the background and downloads the packages to your user profile (c:\users\<username>\.dnx\packages). When you run dnx.exe, it will resolve any dependencies from your user profile. (You’ll see how to override this in a bit.)

Since “Local System” doesn’t have Visual Studio helping it out, we need to get those packages installed someplace it can see. Running dnu restore as “Local System” will download the packages to c:\Windows\SysWOW64\config\systemprofile\.dnx\packages (on a 64-bit OS), since there’s no user profile for that account. How can you run dnu restore as “Local System”? By running psexec.exe -s cmd.exe from Sysinternals (which opens a cmd.exe console as “Local System”) and then running dnu restore from the directory of your project. You can imagine that having to do this each time you want to deploy is a gigantic inconvenience.

While you’re IN DEVELOPMENT, you should run the Windows service under your own account. This allows you to install the service once and not have to manually run dnu restore under another account each time you modify your config (remember, Visual Studio is helping you out here). The service will be able to resolve the dependencies since it’s running under your account as well.

IN PRODUCTION, we can publish our project by running the following command (from the project’s root folder) and point the sc.exe create binPath= to its output:

dnu publish --out ..\publish-output --runtime active --no-source

This command builds and packages your code and copies it and all of its dependencies into an output folder. It will also add a global.json file in a sub-folder within the output directory that tells DNX to load the packages from the local folder and not the user’s profile. If that’s not enough, the command also packages the “active” DNX runtime. This means that the target machine doesn’t require DNX to be installed and doesn’t require you to run dnu restore on the target machine either. With this method, once you copy the published folder to the production machine, you can run the service under the account of your choosing and point to the run.cmd file within the root of the output folder.

sc.exe create <service-name> obj= <machine|domain>\<username> binPath= "\"<output-folder-path>\run.cmd\" --windows-service"

Now you have what you need to run ASP.NET in a Windows service in both development and production environments.

Get the Source

The source code and installation scripts for this article can be found in the aspnet-windows-service GitHub repo.

In Closing…

We can finally run ASP.NET in our Windows services. This is a big win for us as we host an admin site for our scheduler out of a Windows service. We currently do this using a custom-built static file server and WCF for web services.

There was a lot of information covered in this article, and there is still a lot more that could be covered. If you liked this article or have any questions, please feel free to leave a comment below. Thanks, and I hope you have enjoyed this post.

Not Just .NET: Run node.js Scripts In a Task – Part 1

Taskmatics Scheduler is known for being a powerful tool that .NET developers can use to simplify their task automation. By providing an easy-to-use API, it allows developers to leverage the power of the scheduling platform to run custom .NET tasks. What might not be well known is that it’s also super easy to run code outside the .NET Framework within a custom task. In this three-part series, we’re going to walk through how simple it is to do this by creating a task that runs node.js scripts. In the end, you’ll come away with a custom task that you can use as a basis for scheduling tasks in a bunch of different languages.

Why Run node.js Scripts From Taskmatics Scheduler?

Without Taskmatics Scheduler, managing task automation usually means overseeing a growing number of executables or scripts that are scheduled using Windows Task Scheduler or Cron. The end result is usually a cacophony of code where one task may fail and write some error to a database table or file somewhere on the system while another task just fails and doesn’t indicate the underlying reason. Also, trying to keep track of which jobs successfully ran or are in the process of running can be a nightmare. The beauty of Taskmatics Scheduler is that task scheduling, execution and reporting can be managed from one place:  the administration website (‘admin’). Furthermore, logging and reporting for every task is done for you in a centralized and consistent manner.

Taskmatics Scheduler makes it possible to extend the benefits we just covered to other languages and frameworks as well. Since each task instance is spawned as its own process, you can create your own child processes in every task without having to worry about affecting the overall ecosystem of the Scheduler. This means that any code that can be run from a command line can be run by Taskmatics Scheduler, and you get all the same features and benefits that standard .NET tasks receive.

There are, of course, some prerequisites before you can run a node.js script (or any other code, for that matter) from Taskmatics Scheduler. Node.js isn’t installed as part of the Taskmatics Scheduler installation process, so node.js must be installed on the computer that runs Taskmatics Scheduler. The same applies to any other language: the runtime must be available to execute the code, or it simply won’t work. Once that’s out of the way, we can use the Taskmatics Scheduler API to write a task template that can serve as the basis for not only our node.js task, but any other scripting tasks we want to create as well (think Python, PowerShell and the like):

public abstract class ExecuteProcessTask : TaskBase
{
    protected override void Execute()
    {
        var info = new ProcessStartInfo(GetProcessFileName());
        info.Arguments = GetArguments();
        info.RedirectStandardOutput = true;
        info.RedirectStandardError = true;
        info.UseShellExecute = false;
        info.CreateNoWindow = true;

        var process = new Process();
        process.StartInfo = info;
        process.OutputDataReceived += (s, e) => Log("INFO", e.Data);
        process.ErrorDataReceived += (s, e) => Log("ERROR", e.Data);

        process.Start();
        process.BeginOutputReadLine();
        process.BeginErrorReadLine();
        process.WaitForExit();

        HandleExitCode(process.ExitCode);
    }

    protected abstract string GetProcessFileName();
    protected abstract string GetArguments();

    protected virtual void Log(string type, string message)
    {
        if (message == null)
            return;

        Context.Logger.Log("{0}: {1}", type, message);
    }

    protected virtual void HandleExitCode(int exitCode)
    {
        if (exitCode == 0)
            return;

        throw new ApplicationException("The process failed with exit code " + exitCode + ".");
    }
}

Analyzing the Code

Tasks in Taskmatics Scheduler inherit from TaskBase and must override the Execute method. This task does so by using the .NET Framework’s Process and ProcessStartInfo classes to create and execute a child process, redirecting all standard and error output to Taskmatics Scheduler’s centralized logging infrastructure, which stores and streams the output to the user in the admin just like any other task. Being an abstract class, it provides the GetProcessFileName method for determining the file system path to the executable that will run our script, while the GetArguments method is where the logic for building the arguments to that script is handled.

The Log method bridges the output of the child process to the Taskmatics Scheduler real-time logger out of the gate, but the method is left virtual, which allows logging to be customized if additional logging infrastructure is needed. Finally, the HandleExitCode method shown above simply throws an exception if the exit code is nonzero, which the Taskmatics Scheduler system treats as a failed task status. Again, since the method is virtual, it remains flexible for those who need more complex behavior when a process completes.

Another key feature of Taskmatics Scheduler is its extensibility. It’s a snap to use custom parameter objects in these tasks, which makes extending ExecuteProcessTask into a single task that can run any node.js script a piece of cake.
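As a preview of where the series is headed, a concrete node.js task built on this template could be as small as the sketch below. The class name and both paths are hypothetical placeholders, not the implementation from the next article; a real task would read them from a custom parameters object configured in the admin:

```csharp
// Hypothetical sketch: running a fixed node.js script via ExecuteProcessTask.
public class NodeScriptTask : ExecuteProcessTask
{
    protected override string GetProcessFileName()
    {
        // Assumed node.js install location.
        return @"C:\Program Files\nodejs\node.exe";
    }

    protected override string GetArguments()
    {
        // Assumed script path, quoted in case it contains spaces.
        return @"""C:\scripts\hello.js""";
    }
}
```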

Next Up…

With a simple task, we’ve provided a basis for running code written in any language within Taskmatics Scheduler. This allows you to centralize all of your task automation under one roof, regardless of the language. You also get a consistent execution and logging pattern that can make maintaining a large number of disparate tasks much easier than using Windows Task Scheduler or Cron. In the next article, we’ll create a simple node.js script and wrap it in a task using the ExecuteProcessTask class from this article, and we’ll finish up the series by demonstrating how easy it is to schedule our new node.js task in the admin and see it run.

Why use Taskmatics Scheduler

Why use Taskmatics Scheduler?

Taskmatics is preparing for the first major release of its flagship application, Taskmatics Scheduler. If you’re a .NET developer, you should be excited about this application. To understand why, it helps to understand the motivation behind creating the system in the first place. We believe that the same reasons that compelled us to create Taskmatics Scheduler will drive .NET developers to adopt it.

The Problem

As .NET developers, most of us have been involved in the development of enterprise-class systems. These are generally large, often complex applications that:

  • Encapsulate business logic in code via services, assemblies, etc. (e.g., a ‘Customer’ object that encapsulates all the rules for managing a customer)
  • Encapsulate complex business processes involving multiple objects (e.g., onboarding/ingesting product, creating an order)
  • Use one or more data stores for persistence
  • Have one or more UI layers (e.g., an administrative app, a consumer-facing app)

All of these components work together to form our respective ‘systems’. However, problems begin to arise when we need to support batch or offline operations that utilize the business rules already built into these systems. For example:

  • At regular intervals we need to ingest new product into our catalog
  • At specified times we need to check with a third party to see if there are new orders to add to our system
  • ETL activities (import/export data)
  • Rebuilding indices and aggregating data

Thus, the need for a centralized job management solution in this type of environment is crucial.

A Pseudo Solution

Like many .NET developers, we turned to Windows Task Scheduler for our job management needs. Windows Task Scheduler can start batch or .exe files, has a multitude of scheduling options, and is available on any flavor of Windows Server installation.

Issues with This Approach

Windows Task Scheduler views each task under its management as an independent entity. This approach has some benefits, but also some severe drawbacks that quickly become management headaches as the underlying job infrastructure evolves. Notably:

  • No common framework, so the job infrastructure quickly becomes the Wild West
  • No centralized logging solution
  • No remote management
  • A very difficult, clumsy mechanism for using shared files (assemblies)
  • No extensibility
  • A single-server solution
  • Laborious to update and maintain existing jobs

(Note: We also evaluated SQL Server, Quartz, and ActiveBatch. Stay tuned for upcoming discussions of these products’ shortcomings and why we were compelled to create Taskmatics Scheduler.)

Finally, a Solution!

We concluded that there was no suitable task management system available that addressed the needs of an enterprise .NET developer, so we decided to write our own. From our experience, we were certain that a task management solution needed at least the following:

  • Job isolation (a poorly performing job must NOT be able to bring down the entire job infrastructure)
  • Remote management
  • Extensibility
  • A common framework
  • Common logging
  • The ability to update jobs while jobs are running
  • The ability to leverage common code (assemblies)
  • Reporting
  • Resource utilization by job
  • Security (access and authorization)
  • The ability to scale out to multiple job servers
  • A High Availability configuration

Since no application in the marketplace supported our desired feature set, the only way to get what we needed was to create it ourselves. So, we did just that.

Our initial version of the application was created for internal use. It was a bare-bones application that lacked an administrative console and had a very crude configuration system. However, our internal adoption of and reliance on that crude system convinced us of the need for this feature set, so after two years of evolution and internal use we re-developed the application with the goal of releasing it to the public.

And thus, Taskmatics Scheduler was born! It represents all of the knowledge gained from our experience with the initial system and, most importantly, addresses the areas we found lacking (configuration and installation). The result is a full-featured task management system that is a must-have for any .NET developer.