Taskmatics Scheduler 1.1 Released

We’ve released version 1.1.0.0 today, which contains many performance improvements as well as some new features requested by the community. If you download the installer today you’ll get the new version, and we’ve also updated the Taskmatics Scheduler package on the NuGet repository. The primary focus of this build was to improve system-wide performance: in installations where a significant number of tasks run at high frequencies, we had seen some slowness. With these changes we’ve measured time improvements of 80% to 90%, and not just in the administration UI screens but in the reports as well.

How to get it

To get the 1.1.0.0 installer, go here!

What’s Inside

The full list of changes in this release is as follows:

  • Added the ability to set exclusion dates on the calendar trigger; the scheduled task will not run on those dates.
  • Fixed an intermittent trigger loading error when adding a trigger to an existing scheduled task.
  • Added performance improvements to all reports.
  • Added performance improvements to dashboard loading.
  • Added performance improvements to task dispatching.
  • Added performance improvements to task history search.
  • Database schema changes to help improve overall system performance.
  • Fixed an intermittent out-of-memory exception when querying resource usage.
  • Added filter to the Dashboard screen to allow users to find scheduled tasks that contain the entered term (similar to filter on other listing screens).
  • Added additional error handling and update messaging to the installer process.

Breaking Changes

While it’s always a goal to be fully backwards compatible, this release does contain one breaking change that may require a rebuild of one or more tasks. In the past, each task instance was identified by a Globally Unique Identifier, or GUID. These identifiers are great at ensuring that billions of tasks can coexist without colliding, but the downside to that guarantee of uniqueness is that they are not as performant as sequential integers. As part of the new release, we’ve re-mapped all task instances to use 64-bit integers, which gives us the same uniqueness guarantee within the system but performs much better under search.

You should update all of your task libraries to reference the Taskmatics.Scheduler.Core 1.1.0.0 NuGet package. If any of your tasks reference TaskBase.Context.TaskInstanceId, you will need to rebuild and re-deploy those tasks, as the TaskBase.Context.TaskInstanceId property is now represented as an Int64 instead of a Guid.
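For example, a task that reads its own instance ID would change the variable type from Guid to long. A minimal sketch, assuming a hypothetical task class of your own (not code shipped with the product):

using Taskmatics.Scheduler.Core;

public class MyTask : TaskBase
{
    protected override void Execute()
    {
        // Before 1.1.0.0, TaskInstanceId was a Guid:
        // Guid instanceId = Context.TaskInstanceId;

        // From 1.1.0.0 on, TaskInstanceId is an Int64:
        long instanceId = Context.TaskInstanceId;
        Context.Logger.Log("Running task instance {0}.", instanceId);
    }
}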

Upgrade Instructions

As of now, we don’t have an upgradeable installer. To upgrade a 1.0.0.0 installation of Taskmatics Scheduler, first uninstall the previous version, then run the new 1.1.0.0 installer and point it to the existing database that Taskmatics was originally set up to use. Note: Uninstalling Taskmatics Scheduler will NEVER remove the data in your database. We suggest following these steps to install the new version on your system:

  1. Stop all coordinator and agent Windows services (done from the Services snap-in).
  2. Back up your existing database. This is precautionary in case anything unexpected happens during the upgrade.
  3. Take note of the following information from your current installation:
    • Your serial number, which is in the email you received when you purchased Taskmatics Scheduler. You can also find it in the license.xml file in the ProgramData\Taskmatics\Scheduler directory.
    • Your current database server, database name and the runtime credentials. These can be found in the ProgramData\Taskmatics\Scheduler\Taskmatics.Scheduler.Coordination.ServiceHost.exe.config file.
    • The runtime users for both the coordinator and agent Windows services, which you can see from the services snap-in.
    • The root filesets path and the root working folders path, which are found in the .config files located in the ProgramData\Taskmatics\Scheduler directory.
  4. Uninstall the existing Taskmatics Scheduler components (done from the Programs and Features screen in Windows).
  5. Run the new Taskmatics.Scheduler installer. You will need to re-enter the information you collected in step 3 to make sure that all the original permissions and mappings are re-used.

Due to the nature of the performance improvements and the significant number of database changes in this update, installation of the new version could take upwards of 30 minutes depending on the size of your database (for example, if you have millions of task instances). Making the upgrade path smoother for our users is at the top of our roadmap, so we don’t expect this to be the norm going forward.

More to Come

This is just the beginning of the changes that we have in store. Thanks again for using Taskmatics Scheduler, and we hope you enjoy working with the new version even more. Let us know what features you’d like to see in subsequent versions of Taskmatics Scheduler as we’re always looking for ideas from the community.

Where was that download link again?

In case you missed it at the top, to get the 1.1.0.0 installer, go here!

How to Host ASP.NET in a Windows Service

Today I’ll show you how you can finally host ASP.NET in a Windows service. Yes, with ASP.NET 5 it is possible. This article builds on a previous one, which shows you how to run a DNX (.NET Execution Environment) application in a Windows service. It’s a quick read, so go read it now.

Project Dependencies

Once you’ve got the shell project set up using the previous article, you’ll need to add in some dependencies.

Open up project.json and add in a dependencies property with the following entries:

  • "Microsoft.AspNet.Hosting": "1.0.0-beta7" – Bootstraps the web server
  • "Microsoft.AspNet.Server.Kestrel": "1.0.0-beta7" – Web server implementation
  • "Microsoft.AspNet.StaticFiles": "1.0.0-beta7" – Hosts static files
  • "Microsoft.AspNet.Mvc": "6.0.0-beta7" – Includes all the MVC packages

Your project.json should now look like this:

{
    "version": "1.0.0-*",
    "description": "MyDnxService Console Application",
    "commands": {
        "run": "MyDnxService"
    },
    "frameworks": {
        "dnx451": {
            "frameworkAssemblies": {
                "System.ServiceProcess": "4.0.0.0"
            }
        }
    },
    "dependencies": {
        "Microsoft.AspNet.Hosting": "1.0.0-beta7",
        "Microsoft.AspNet.Server.Kestrel": "1.0.0-beta7",
        "Microsoft.AspNet.StaticFiles": "1.0.0-beta7",
        "Microsoft.AspNet.Mvc": "6.0.0-beta7"
    }
}

Run in Visual Studio or Command Line

It would really be a pain to develop and debug your application while running as a Windows service. That’s not to say you can’t, but if you are currently developing an application, you probably want to run it from Visual Studio or the command line. To do this, you need some sort of switch in the Main method that will either call ServiceBase.Run or just call directly into OnStart and OnStop. Let’s do this by writing our Main method as follows:

public void Main(string[] args)
{
    if (args.Contains("--windows-service"))
    {
        Run(this);
        return;
    }

    OnStart(null);
    Console.ReadLine();
    OnStop();
}

Simply check for the presence of the --windows-service argument, call ServiceBase.Run and you are good to go. Now you can just run and debug your application from Visual Studio by hitting F5. When you want to install the application as a Windows service, don’t forget to pass --windows-service at the end of the binPath= argument.

sc.exe create <service-name> binPath= "\"<dnx-exe-path>\" \"<project-path>\" run --windows-service"

Configure and Start the Server and ASP.NET

Let’s add some namespaces:

using Microsoft.AspNet.Builder;
using Microsoft.AspNet.Hosting;
using Microsoft.AspNet.Hosting.Internal;
using Microsoft.Framework.Configuration;
using Microsoft.Framework.Configuration.Memory;
using Microsoft.Framework.DependencyInjection;
using System;
using System.Diagnostics;
using System.Linq;
using System.ServiceProcess;

Before we start the server, we need to set up a few fields and a constructor on the Program class. We need to store the hosting engine, its shutdown function and a service provider that we will use soon. The IServiceProvider instance will be injected by the DNX runtime.

private readonly IServiceProvider _serviceProvider;
private IHostingEngine _hostingEngine;
private IDisposable _shutdownServerDisposable;

public Program(IServiceProvider serviceProvider)
{
    _serviceProvider = serviceProvider;
}

To get the server up and running, we will use the WebHostBuilder class. Your Program.OnStart method should look as follows:

protected override void OnStart(string[] args)
{
    var configSource = new MemoryConfigurationSource();
    configSource.Add("server.urls", "http://localhost:5000");

    var config = new ConfigurationBuilder(configSource).Build();
    var builder = new WebHostBuilder(_serviceProvider, config);
    builder.UseServer("Microsoft.AspNet.Server.Kestrel");
    builder.UseServices(services => services.AddMvc());
    builder.UseStartup(appBuilder =>
    {
        appBuilder.UseDefaultFiles();
        appBuilder.UseStaticFiles();
        appBuilder.UseMvc();
    });

    _hostingEngine = builder.Build();
    _shutdownServerDisposable = _hostingEngine.Start();
}

There are several steps involved here:

  1. Create and populate the server configuration. (lines 3-6)
  2. Create the builder and tell it what server implementation to use. (lines 7-8)
  3. Configure services using the built-in dependency injection support. (line 9)
  4. Configure the ASP.NET middleware pipeline. (lines 10-15)
  5. Build and start the server. (lines 17-18)

The previous code is oversimplified for the purposes of this article. We are hardcoding the server URL into an in-memory config store, but you can set this up in other ways. See the ASP.NET Configuration repo on GitHub for other options.
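For instance, one quick variation (a sketch using only the APIs shown above; the SERVER_URLS environment variable name is our own invention, not part of the framework) is to let an environment variable override the hardcoded URL:

// Hypothetical: allow an environment variable to override the default URL.
var urls = Environment.GetEnvironmentVariable("SERVER_URLS") ?? "http://localhost:5000";

var configSource = new MemoryConfigurationSource();
configSource.Add("server.urls", urls);
var config = new ConfigurationBuilder(configSource).Build();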

To gracefully shut down the server, implement Program.OnStop as follows:

protected override void OnStop()
{
    if (_shutdownServerDisposable != null)
        _shutdownServerDisposable.Dispose();
}

Getting Down to Business

Now that we have a server set up and running ASP.NET with static files and MVC, let’s add some content and a controller.

First, create an index.html file and add “Hello world” as the content. Then create a controller file TimeController.cs with the file contents as follows:

using Microsoft.AspNet.Mvc;
using System;

namespace MyDnxService
{
    public class TimeController
    {
        [Route("time")]
        public DateTime GetTime()
        {
            return DateTime.Now;
        }
    }
}

That’s all there is to it. Now we can pull up http://localhost:5000 to see “Hello world” and http://localhost:5000/time to see the current time.

The Runtime User and Going to Production

When you install a Windows service, you have to specify a user under which to run the process. If you don’t, “Local System” is the default. Why is this important?

In the previous article we simply ran our Windows service as the default (Local System) user. It turns out we got lucky since we didn’t reference any package dependencies. If we try to do the same thing in this case, the service will quickly fail to start because “Local System” won’t be able to resolve any of the package dependencies we just added.

As you add package references to the project.json file, Visual Studio quietly runs dnu restore in the background and downloads the packages to your user profile (c:\users\<username>\.dnx\packages). When you run dnx.exe, it will resolve any dependencies from your user profile. (You’ll see how to override this in a bit.)

Since “Local System” doesn’t have Visual Studio helping it out, we need to somehow get those packages installed someplace that it can see. Running dnu restore as “Local System” will download the packages to c:\Windows\SysWOW64\config\systemprofile\.dnx\packages (on the 64-bit OS) since there’s no user profile for that account. How can you run dnu restore as “Local System”? This can be done by running psexec.exe -s cmd.exe from Sysinternals (runs a cmd.exe console as “Local System”) and then running dnu restore from the directory of your project. You can imagine that having to do this each time you want to deploy is a gigantic inconvenience.
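For reference, that one-time restore under “Local System” looks something like this (a sketch: <project-path> is a placeholder, and the commands assume an elevated prompt with the Sysinternals tools available):

psexec.exe -s cmd.exe
cd /d "<project-path>"
dnu restore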

While you’re IN DEVELOPMENT, you should run the Windows service under your own account. This allows you to install the service once and not have to manually run dnu restore under another account each time you modify your config (remember, Visual Studio is helping you out here). The service will be able to resolve the dependencies since it’s running under your account as well.

IN PRODUCTION, we can publish our project by running the following command (from the project’s root folder) and point the sc.exe create binPath= to its output:

dnu publish --out ..\publish-output --runtime active --no-source

This command builds and packages your code and copies it and all of its dependencies into an output folder. It will also add a global.json file in a sub-folder within the output directory that tells DNX to load the packages from the local folder and not the user’s profile. If that’s not enough, the command also packages the “active” DNX runtime. This means that the target machine doesn’t require DNX to be installed and doesn’t require you to run dnu restore on the target machine either. With this method, once you copy the published folder to the production machine, you can run the service under the account of your choosing and point to the run.cmd file within the root of the output folder.

sc.exe create <service-name> obj= <machine|domain>\<username> binPath= "\"<output-folder-path>\run.cmd\" --windows-service"

Now you have what you need to run ASP.NET in a Windows service in both development and production environments.

Get the Source

The source code and installation scripts for this article can be found in the aspnet-windows-service GitHub repo.

In Closing…

We can finally run ASP.NET in our Windows services. This is a big win for us as we host an admin site for our scheduler out of a Windows service. We currently do this using a custom-built static file server and WCF for web services.

There was a lot of information covered in this article, and there is still a lot more that could be covered. If you liked this article or have any questions, please feel free to leave a comment below. Thanks, and I hope you have enjoyed this post.

How to Run DNX Applications in a Windows Service

Today I’m going to show you how to run a DNX application in a Windows service. We recently needed to do this ourselves, and while there are many posts out there asking how to do it, the answers would lead you to believe it’s impossible. For this example we’re going to use the beta6 version of ASP.NET 5. If you don’t yet have DNX, go here to get started.

Creating the Application

For demonstrative purposes, we’ll be creating a very simple DNX console application that writes to the event log when the service is started or stopped. To begin, create a new DNX Console Application (Package) in Visual Studio. You can find this template in the “Web” category, for some reason. Next we need to make a few tweaks to the project.json file:

  • First, remove the DNX Core reference; this is a Windows service, after all, so there’s no point in trying to be cross-platform today.
  • Under frameworks:dnx451, add a frameworkAssemblies reference to "System.ServiceProcess": "4.0.0.0"
  • Change your command key to something simple like “run” or “runservice”. You don’t have to do this, but it makes what we’re doing later clearer.

When you’re all done, your project.json should look something like this:

{
    "version": "1.0.0-*",
    "description": "MyDnxService Console Application",
    "commands": {
        "run": "MyDnxService"
    },
    "frameworks": {
        "dnx451": {
            "frameworkAssemblies": {
                "System.ServiceProcess": "4.0.0.0"
            }
        }
    }
}

Next, make the Program class inherit from System.ServiceProcess.ServiceBase. This lets us override the OnStart and OnStop service methods, which we’ll use simply to log messages to the event log. Finally, in the Main(string[] args) method of the Program class we add Run(this); to bootstrap the Windows service. Here is our Program.cs file in its entirety:

using System.Diagnostics;
using System.ServiceProcess;

namespace MyDnxService
{
    public class Program : ServiceBase
    {
        private readonly EventLog _log = 
            new EventLog("Application") { Source = "Application" };

        public void Main(string[] args)
        {
            _log.WriteEntry("Test from MyDnxService.", EventLogEntryType.Information, 1);
            Run(this);
        }

        protected override void OnStart(string[] args)
        {
            _log.WriteEntry("MyDnxService started.");
        }

        protected override void OnStop()
        {
            _log.WriteEntry("MyDnxService stopped.");
        }
    }
}

Installing the Service

Once our application is written and building successfully we can install it as a service. To do this, open a command prompt in administrator mode and enter the following command:

sc.exe create <service-name> binPath= "\"<dnx-exe-path>\" \"<project-path>\" run"

UPDATE
If you’re targeting beta8 or beyond, the way we tell dnx.exe where to find our project has changed to include a --project (-p for short) argument. So the above command changes to:
sc.exe create <service-name> binPath= "\"<dnx-exe-path>\" -p \"<project-path>\" run"

sc.exe is a built-in tool that lets us perform a number of operations on a Windows service. Here we’re using it to create a new service. The very first argument that the create operation needs is the service name. This is the name that will show up in the Services snap-in. You can set this to any value, but since our example application is called MyDnxService we put our adventurous nature aside and simply used that.

The binPath= parameter is the only other parameter we’re specifying to sc.exe, even though it looks like three separate parameters. We’re basically telling sc.exe what command to execute when starting the service, which is DNX. The other parameters after that are actually parameters to dnx.exe, not sc.exe. Because there are spaces in the argument value, we are wrapping the entire argument in quotes (I’ll explain the escaped quotes in a minute). One gotcha to keep in mind when working with sc.exe is that the “=” you see after “binPath” is actually part of the parameter name for sc.exe, so that space you see between binPath= and the value is necessary as it tells sc.exe that we’re moving from the parameter name to the value. Now let’s look at the three components in that binPath= argument:

  • Since dnx.exe is the application that runs a DNX application, we need to point to it via its full path. If you install DNX via DNVM, the default install directory (at least on our machines) is c:\users\<username>\.dnx\runtimes\<runtime-version>\bin\dnx.exe, but if you aren’t sure where it is on your machine, just open a command prompt, run where.exe dnx.exe, and it should tell you where to find it. Since the path to DNX could have spaces, we’re wrapping that parameter in quotes too, but because these quotes are inside the quotes we specified for the binPath= parameter to sc.exe, we need to escape them.
  • Next we need to tell DNX where to find our application. If you use dnu publish to publish your application, it will generate a .cmd file that you would normally use to launch it. You cannot use that .cmd file when running your application as a Windows service, but if you open it you’ll see a path for --appbase, and that’s the one we want. Again, escape the quotes around this path for safety.
  • Finally we tell DNX what command within our application to run. Remember above in the project.json file we named our command key “run”? That’s what this value specifies. So if you named your command key something else, just replace the “run” parameter to DNX with whatever command key name you chose in your project.json.

So putting all of that together, the complete sc.exe command we used for the application above (without the generic tokens) was:
sc.exe create MyDnxService binpath= "\"C:\Users\dave\.dnx\runtimes\dnx-clr-win-x86.1.0.0-beta6\bin\dnx.exe\" \"C:\Users\dave\Desktop\DnxWindowsService\src\MyDnxService\" run"

UPDATE
As mentioned above, if you’re targeting beta8 or beyond, the way we tell dnx.exe where to find our project has changed to include a --project (-p for short) argument. So the above command changes to:
sc.exe create MyDnxService binpath= "\"C:\Users\dave\.dnx\runtimes\dnx-clr-win-x86.1.0.0-beta8\bin\dnx.exe\" -p \"C:\Users\dave\Desktop\DnxWindowsService\src\MyDnxService\" run"

Running the Service

Now that our application has been installed as a service, it’s time to revel in the fruits of our labor. To run the service, open the Services MMC Snap-in, find the service we installed by its name and start it. Open Event Viewer, go to the Application log and you should see the text that we output from our OnStart() override in our application. Stopping the service and refreshing the event log will show the stop message. As I said earlier, this is a super basic demonstration of functionality, but it’s the basis for running any DNX application in the context of a Windows service.

Why Do This?

Taskmatics Scheduler currently hosts the administration website from inside one of the Windows services that are installed. In its current implementation we couldn’t support ASP.NET applications because only IIS would provide the environment necessary to properly run each request through the pipeline. With ASP.NET 5, the game changes, and you can host ASP.NET applications from a console application without being dependent on IIS. After some headbanging we are successfully running our MVC 6 web application inside of a Windows service, which you can now read about here.

Update!

Check out Erez’s follow up post showing you how to host a fully functional ASP.NET site from a Windows service, which has just been posted!

Monitoring Flights and Sending SMS with Taskmatics Scheduler and Enzo Unified

Software developers need to build solutions quickly so that businesses remain competitive and agile. This blog post shows you how Taskmatics Scheduler and Enzo Unified can help developers build and deploy solutions very quickly by removing two significant pain points: the learning curve of new APIs, and the orchestration of Internet services.

Sample Solution

Let’s build a solution that checks incoming flights in Miami, Florida and sends a text message via SMS to one or more phone numbers when new flights arrive. To track flight arrivals, we will be using the FlightAware service, which provides a REST API for retrieving flight information. To send SMS messages, we will be using Twilio’s service, which likewise provides an API for sending messages.

To remove the learning curve of these APIs, we used Enzo Unified, a Backend as a Service (BaaS) platform that enables the consumption of services through native SQL statements. Enzo Unified abstracts communication and simplifies development against a large number of internal systems and Internet services. In this example, Enzo Unified is hosted on the Microsoft Azure platform for scalability and operational efficiency.

To orchestrate and schedule the solution, we used the Taskmatics Scheduler platform. Taskmatics calls into your custom .NET code on a schedule that you specify; in this solution, that code connects to Enzo Unified in the cloud. The call to Enzo Unified is made using ADO.NET, sending native SQL statements to pull information from FlightAware and to send an SMS message through Twilio. At a high level, the solution looks like this:

High Level call sequence between Taskmatics Scheduler and Enzo Unified

How To Call FlightAware and Twilio with Enzo Unified

Developers can call Enzo Unified using a REST interface, or a native SQL interface. In this example, the developer uses the SQL interface, leveraging ADO.NET. The following code connects to Enzo Unified as a database endpoint using the SqlConnection class, and sends a command to fetch flights from a specific airport code using an SqlCommand object. Fetching FlightAware data is as simple as calling the “Arrived” stored procedure against the “flightaware” database schema.

var results = new List<ArrivedFlightInfo>();

// Connect to Enzo Unified using SqlConnection
using (var connection = new SqlConnection(parameters.EnzoConnectionString))
// Prepare the call to FlightAware's Arrived procedure
using (var command = new SqlCommand("flightaware.arrived", connection))
{
    connection.Open();
    command.CommandType = System.Data.CommandType.StoredProcedure;
    command.Parameters.Add(new SqlParameter("airport", airportCode));
    command.Parameters.Add(new SqlParameter("count", 10));
    command.Parameters.Add(new SqlParameter("type", "airline"));

    // Call FlightAware's Arrived procedure
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            results.Add(new ArrivedFlightInfo
            {
                Ident = (String)reader["ident"],
                AircraftType = (String)reader["aircrafttype"],
                OriginICAO = (String)reader["origin"],
                OriginName = (String)reader["originName"],
                DestinationName = (String)reader["destinationName"],
                DestinationCity = (String)reader["destinationCity"]
                // ... additional code removed for clarity ...
            });
        }
    }
}
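For reference, ArrivedFlightInfo is just a simple data holder; a minimal sketch consistent with the properties used above (the exact shape is our assumption):

// Assumed shape of the ArrivedFlightInfo data holder used above.
public class ArrivedFlightInfo
{
    public String Ident { get; set; }            // Flight identifier
    public String AircraftType { get; set; }     // Aircraft type code
    public String OriginICAO { get; set; }       // Origin airport ICAO code
    public String OriginName { get; set; }       // Origin airport name
    public String DestinationName { get; set; }  // Destination airport name
    public String DestinationCity { get; set; }  // Destination city
}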

Calling Twilio is just as easy. A simple ADO.NET call to the SendSMS stored procedure in the “Twilio” schema is all that’s needed (the code is simplified to show the relevant part of the call).

// Establish a connection to Enzo Unified
using (var connection = new SqlConnection(parameters.EnzoConnectionString))
using (var command = new SqlCommand("twilio.sendsms", connection))
{
    connection.Open();
    command.CommandType = System.Data.CommandType.StoredProcedure;
    command.Parameters.Add(new SqlParameter("phones", phoneNumbers));
    command.Parameters.Add(new SqlParameter("message", smsMessage));

    // Call Twilio's SendSMS method
    command.ExecuteReader();
}

If you inspect the above code carefully, you will notice that it does not reference the APIs of FlightAware or Twilio. Indeed, both FlightAware and Twilio are called through plain ADO.NET statements against Enzo Unified. Because Enzo Unified behaves like a native database server (without the need to install special ODBC drivers), authentication, the actual API calls and the interpretation of the REST results are entirely abstracted away from the developer and replaced by an SQL interface, which dramatically increases developer productivity. Database developers can also call Enzo Unified directly to test FlightAware and Twilio from SQL Server Management Studio (SSMS). The following picture shows the results of calling Enzo Unified from SSMS to retrieve arrived flights from FlightAware.

Calling the FlightAware service using simple SQL syntax in SQL Server Management Studio

Sending an SMS text message using Twilio is just as simple from SSMS:

Calling the Twilio service using simple SQL syntax in SQL Server Management Studio

How To Schedule The Call With Taskmatics Scheduler

In order to run and schedule this code, we used Taskmatics Scheduler, which provides an enterprise-grade scheduling and monitoring platform. When a .NET class inherits from the Taskmatics.Scheduler.Core.TaskBase class, it automatically becomes available as a custom task inside the Taskmatics Scheduler user interface. This means that a .NET library can easily be scheduled without writing additional code. Furthermore, marking the custom class with the InputParameters attribute provides a simple way to specify input parameters for your task (such as the airport code to monitor and the phone numbers to notify) through the Taskmatics user interface.

The following simplified code shows how a custom task class is created so that it can be hosted inside the Taskmatics Scheduler platform. Calling Context.Logger.Log gives developers the ability to log information directly to Taskmatics Scheduler for troubleshooting purposes.

namespace Taskmatics.EnzoUnified.FlightTracker
{
    // Mark this class so it is visible in the Taskmatics interface
    [InputParameters(typeof(FlightNotificationParameters))]
    public class FlightNotificationTask : TaskBase
    {
        // Override the Execute method called by Taskmatics on a schedule
        protected override void Execute()
        {
            // Retrieve parameters as specified inside Taskmatics
            var parameters = (FlightNotificationParameters)Context.Parameters;

            // Invoke method that calls FlightAware through Enzo Unified
            var arrivedFlights = GetArrivedFlights(parameters);

            // Do more work here... such as identify new arrivals
            var newFlights = FlightCache.FilterNewArrivals(arrivedFlights);

            // Do we have new arrivals since the last call?
            if (newFlights.Count > 0)
            {
                // Invoke method that calls Twilio through Enzo Unified
                var results = SendArrivedFlightsViaSMS(newFlights, parameters);

                // Update cache so these flights won't be sent through SMS again
                FlightCache.SaveFlightsToCache(newFlights);
            }
            else
            {
                Context.Logger.Log("SMS phase skipped due to no new arrivals.");
            }

            Context.Logger.Log("Job execution complete.");
        }
    }
}
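The FlightNotificationParameters class referenced above is a plain parameter object surfaced in the Taskmatics UI; here is a hypothetical sketch consistent with the values the task uses (the property names, other than EnzoConnectionString, are our assumptions):

// Hypothetical parameter object; property names are assumptions.
public class FlightNotificationParameters
{
    public string EnzoConnectionString { get; set; } // Connection string to Enzo Unified
    public string AirportCode { get; set; }          // e.g. "KMIA" for Miami International
    public string PhoneNumbers { get; set; }         // Phone number(s) to notify via SMS
}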

Installing the task into the Taskmatics Scheduler platform is very straightforward. Log into the user interface and create a definition for the flight tracker task. This step allows you to import your library into the system to serve as a template for the new scheduled task that we will create next.

Import your custom task as a definition.

Schedule your custom task to run on the days and times you specify.

Once you have created your definition, go to the “Scheduled Tasks” section of the user interface, and create the task by selecting the definition that you just created from the Task dropdown. This is also where you will schedule the time and frequency that the task will run as well as configure the input parameters for the task.

Configure the parameters for the scheduled task.

Finally, from the Dashboard screen, you can run your task manually and watch the output live, or look at a past execution of the task to see the outcome and logs from that run. In the image below, you can see the execution of the Flight Tracking task where we monitored recent arrivals into the Miami International Airport (KMIA).

Review and analyze previous task executions or watch your tasks live as they run.

Conclusion

This blog post shows how developers can easily build integrated solutions using simple SQL statements, without having to learn complex APIs, thanks to Enzo Unified’s BaaS platform. In addition, developers can easily orchestrate and schedule their libraries using the Taskmatics Scheduler platform. By combining the strengths of Enzo Unified and Taskmatics, organizations can reap the following benefits:

  • Rapid application development by removing the learning curve associated with APIs
  • Reduced testing and simple deployment by leveraging already tested services
  • Service orchestration spanning Internet services and on-premises systems
  • Enterprise grade scheduling and monitoring

About Blue Syntax Consulting

Our mission is to make your business successful through the technologies we build, to create innovative solutions that are relevant to the technical community, and to help your company adopt cloud computing where it makes sense. We are now making APIs irrelevant with Enzo® Unified. For more information about Enzo Unified and how developers can access services easily using SQL statements or a simple REST interface, visit http://www.enzounified.com or contact Blue Syntax Consulting at info@bluesyntaxconsulting.com.

About Taskmatics

Taskmatics was founded by a group of developers looking to improve the productivity of their peers. Their flagship application, Taskmatics Scheduler, aims to boost developer productivity and reduce the effort involved in creating consistent and scalable tasks while providing a centralized user interface to manage all aspects of your task automation. For more information and a free 90-day trial, visit us or email us at info@taskmatics.com.

Not Just .NET: Run node.js Scripts In a Task – Part 1

Taskmatics Scheduler is known for being a powerful tool that .NET developers can use to simplify their task automation. By providing an easy-to-use API, it allows developers to leverage the power of the scheduling platform to run custom .NET tasks. What might not be as well known is that it’s also super easy to run code outside the .NET Framework within a custom task. In this three-part series, we’re going to walk through how simple it is to do this by creating a task that runs node.js scripts. In the end, you’ll come away with a custom task that you can use as a basis for scheduling tasks in a bunch of different languages.

Why Run node.js Scripts From Taskmatics Scheduler?

Without Taskmatics Scheduler, managing task automation usually means overseeing a growing number of executables or scripts scheduled through Windows Task Scheduler or Cron. The end result is usually a cacophony of code where one task fails and writes some error to a database table or a file somewhere on the system, while another simply fails without indicating the underlying reason. Keeping track of which jobs ran successfully, or which are still running, can also be a nightmare. The beauty of Taskmatics Scheduler is that task scheduling, execution and reporting can all be managed from one place: the administration website (‘admin’). Furthermore, logging and reporting for every task are handled for you in a centralized, consistent manner.

Taskmatics Scheduler makes it possible to extend the benefits we just covered to other languages and frameworks as well. Since each task instance is spawned off as its own process, you can create your own child processes in every task without having to worry about affecting the overall ecosystem of the Scheduler. This means that any code that can be run from a command line can be run by Taskmatics Scheduler, and you get all the same features and benefits that standard .NET tasks receive.

There are, of course, some prerequisites before you can run a node.js script (or any other code, for that matter) from Taskmatics Scheduler. Node.js isn’t installed as part of the Taskmatics Scheduler installation process, so node.js must be installed on the computer that runs Taskmatics Scheduler. The same applies to all other languages: the runtime must be available to execute the code or it simply won’t work. Once that’s out of the way, we can use the Taskmatics Scheduler API to write a task template that can be used to create not only our node.js task, but any other scripting tasks we want to create as well (think Python, PowerShell and the like):

public abstract class ExecuteProcessTask : TaskBase
{
    protected override void Execute()
    {
        var info = new ProcessStartInfo(GetProcessFileName());
        info.Arguments = GetArguments();
        info.RedirectStandardOutput = true;
        info.RedirectStandardError = true;
        info.UseShellExecute = false;
        info.CreateNoWindow = true;

        var process = new Process();
        process.StartInfo = info;

        // Attach the output handlers before starting so no early output is missed.
        process.OutputDataReceived += (s, e) => Log("INFO", e.Data);
        process.ErrorDataReceived += (s, e) => Log("ERROR", e.Data);

        process.Start();
        process.BeginOutputReadLine();
        process.BeginErrorReadLine();
        process.WaitForExit();

        HandleExitCode(process.ExitCode);
    }

    protected abstract string GetProcessFileName();
    protected abstract string GetArguments();

    protected virtual void Log(string type, string message)
    {
        if (message == null)
            return;

        Context.Logger.Log("{0}: {1}", type, message);
    }

    protected virtual void HandleExitCode(int exitCode)
    {
        if (exitCode == 0)
            return;

        throw new ApplicationException("The process failed with exit code " + exitCode + ".");
    }
}

Analyzing the Code

Tasks in Taskmatics Scheduler inherit from TaskBase and must override the Execute method. This task handles that by using the Process and ProcessStartInfo classes of the .NET Framework to create and run a child process, redirecting all standard and error output to Taskmatics Scheduler’s centralized logging infrastructure, which stores the output and streams it to the user in the admin just like any other task. Being an abstract class, it exposes the GetProcessFileName method for determining the file system path to the executable that will run our script, and the GetArguments method for building the arguments passed to that script.

The Log method bridges the output of the child process to the Taskmatics Scheduler real-time logger out of the gate, but the method is left virtual, which allows customization of how logging is handled if additional logging infrastructure is needed. Finally, the HandleExitCode method shown above simply throws an exception if the exit code is nonzero, which Taskmatics Scheduler treats as a failed task status. Again, since the method is virtual, it stays flexible for those who may need more complex behavior when a process completes.

Another key feature of Taskmatics Scheduler is its extensibility. It’s a snap to use custom parameter objects in these tasks, which makes extending ExecuteProcessTask into a single task that can run any node.js script a piece of cake, as the sketch below suggests.
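As a preview of where the next article is headed, here is a hedged sketch of such a subclass. The NodeJsTask name, the shape of the parameter object, and the assumption that node.exe is on the service’s PATH are all ours, not from the series:

// Hypothetical parameter object for the node.js task.
public class NodeJsTaskParameters
{
    public string ScriptPath { get; set; }      // Full path to the .js file to run
    public string ScriptArguments { get; set; } // Arguments passed to the script
}

// Sketch: runs a node.js script through the ExecuteProcessTask base class above.
[InputParameters(typeof(NodeJsTaskParameters))]
public class NodeJsTask : ExecuteProcessTask
{
    // Assumes node.exe is resolvable from the service's PATH.
    protected override string GetProcessFileName()
    {
        return "node.exe";
    }

    protected override string GetArguments()
    {
        var parameters = (NodeJsTaskParameters)Context.Parameters;
        return "\"" + parameters.ScriptPath + "\" " + parameters.ScriptArguments;
    }
}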

Next Up…

With a simple task, we’ve provided a basis for running code written in any language within Taskmatics Scheduler. This allows you to centralize all of your task automation under one roof, regardless of the language. You also get a consistent execution and logging pattern that can make maintaining a large number of disparate tasks much easier than using Windows Task Scheduler or Cron. In the next article, we’ll create a simple node.js script and wrap it in a task using the ExecuteProcessTask class from this article, and we’ll finish up the series by demonstrating how easy it is to schedule our new node.js task in the admin and watch it run.