Category Archives: Tips

Post Installation Steps for WPS Workstations

We recently wrote a short technical document on the post-installation steps that MineQuest Business Analytics recommends after you install WPS on your workstation. We are often asked what needs to be done after installation to get the greatest performance out of WPS without too much hassle.

The document walks you through modifying your WPS configuration file, moving your work folder to another drive, why you want to install R (for using PROC R, of course!), creating an autoexec.sas file, turning on write caching and a few other pointers. You don't need to do all of the suggestions (after all, they are just suggestions), but they are useful modifications that will enable you to get more out of WPS on your workstation.

You can find the document “Post Installation Steps for WPS Workstations” in the Papers Section of the MineQuest website.

About the author: Phil Rack is President of MineQuest Business Analytics, LLC located in Grand Rapids, Michigan. Phil has been a SAS language developer for more than 25 years. MineQuest provides WPS and SAS consulting and contract programming services and is an authorized reseller of WPS in North America.

Plotting Points on a Street Level Map using the Bridge to R and WPS

In the last few installments of this blog, I have shown how you can use WPS and the Bridge to R to calculate drive distances, geocode records and pull down a map from Google maps. I want to use this post to pull all this together and show how you can geocode your addresses and plot them on a street level map.

First some background you need to know about using Google for geocoding and mapping. There are limits to what Google will allow you to do with their services before they want you to start paying. You can geocode 2,500 records a day for free. You can pull down 25,000 maps a day for free. Once you start moving past these limits, there are fees involved.

One thing you should probably start to consider is caching geocoded records locally so that you don't have to go back to the Google geocoder every time you want to plot some points on a map. I could easily run through 2,500 addresses in a day. The limit on the number of maps is just not an issue for me; 25,000 maps a day is a very liberal allowance for the kind of work I would want to use the service for.
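The local-cache idea can be sketched in a few lines. This is an illustrative sketch in Python rather than the R/ggmap workflow used in this post, and the `geocoder` callable is a hypothetical stand-in for whatever remote geocoding service you actually call:

```python
# Sketch of a local geocode cache: look up an address locally first and
# only spend quota on a remote call when the address has never been seen.
class GeocodeCache:
    def __init__(self, geocoder, daily_limit=2500):
        self.geocoder = geocoder          # callable: address -> (lat, lon)
        self.daily_limit = daily_limit    # Google's free tier at the time
        self.calls_today = 0
        self.cache = {}                   # normalized address -> (lat, lon)

    def lookup(self, address):
        key = address.strip().lower()
        if key in self.cache:             # cache hit: no quota spent
            return self.cache[key]
        if self.calls_today >= self.daily_limit:
            raise RuntimeError("daily geocoding quota exhausted")
        latlon = self.geocoder(address)   # cache miss: one remote call
        self.calls_today += 1
        self.cache[key] = latlon
        return latlon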

In the sample code below, I split the mapping process into two components to make the entire process easier to follow. First I geocode the file to get the latitude and longitude for each record. The second part of the process creates a map and uses those latitudes and longitudes to plot points on it. We could have put this into a single step, but it wouldn't be as clear or as flexible.

Without further ado, here’s the code using the Bridge to R and WPS.

data gasstations;
input company $1-29 address $30-52 city $53-64 state $66-67;
addr2geocode=trim(address)||', '||trim(city)||', '||trim(state);
cards;
Citgo Gas Station            5189 28th St Se        Grand Rapids MI
28th Street BP               5155 28th St Se        Grand Rapids MI
Twenty-Eighth Street C Store 5556 28th St Se        Grand Rapids MI
Speedway                     4045 28th St Se        Grand Rapids MI
Speedway                     2305 E Paris Ave Se    Grand Rapids MI
Superamerica                 2305 E Paris Ave Se    Grand Rapids MI
Shell Food Mart              3960 28th St Se        Grand Rapids MI
Admiral Petroleum            3927 28th St Se        Grand Rapids MI
Cascade C Store              4591 Cascade Rd Se     Grand Rapids MI
Friendly Food Shops          6799 Cascade Rd Se     Grand Rapids MI
Family Fare Quick Stop       6799 Cascade Rd Se     Grand Rapids MI
Cascade Citgo                6820 Cascade Rd Se     Grand Rapids MI
Dutton Fuel Mart LLC         2560 E Beltline Ave Se Grand Rapids MI
Centerpointe Marathon        2560 E Beltline Ave Se Grand Rapids MI
Shell Food Mart              2600 E Beltline Ave Se Grand Rapids MI
Speedway                     4018 Cascade Rd Se     Grand Rapids MI
Grand Rapids Gas Incorporated3214 28th St Se        Grand Rapids MI
Cascade Shell                4033 Cascade Rd Se     Grand Rapids MI
Speedway                     4665 44th St Se        Kentwood     MI
Super Petroleum Incorporated 2411 28th St Se        Grand Rapids MI
;;;;
run;


*--> Geocode the addresses using the Google Geocoder. Keep the geocoded records
     in the output dataset named locs for further processing.;

%rstart(dataformat=csv,data=gasstations,rGraphicsFile=);
datalines4;

## Uncomment the next two lines on the first run to install ggmap:
## options(repos=structure(c(CRAN="http://cran.case.edu/")))
## install.packages("ggmap", dependencies = FALSE)

   attach(gasstations)

   library(ggmap)

   gaddress <- as.character(gasstations$addr2geocode)
   locs <- geocode(gaddress,output="more")

;;;;
%rstop(import=locs);


*--> Pull a Google map that is centered on a particular address and plot the locations
     on the map. Use the data set that was created (locs) above that contains the 
     lat and longs to plot the points.;
     
Title 'Gas Stations on or Near 28th Street';
Title2 'Grand Rapids, Michigan';     
     
%rstart(dataformat=csv,data=locs,rGraphicsFile=);
datalines4;

attach(locs)
addr <- locs

library(ggmap)

# Center the map on one of the 28th Street addresses
map.center <- geocode('3960 28th St Se, Grand Rapids, MI')

# Pull the street map and overlay the geocoded points
grmap <- qmap(c(lon=map.center$lon, lat=map.center$lat), zoom = 13, color = 'color', legend = 'topleft')
grmap + geom_point(aes(x = lon, y = lat), size = 3.0, data = addr)

;;;;
%rstop(import=);

The map that is created looks like this:

GR_Stations

I cropped this down a bit and got rid of the borders so that it would be easier to view on this blog. Note the black points on the map that indicate the locations of the gas stations. We could continue this exercise by adding labels to the points with the names of the service stations, but that would be a good exercise for the reader who wants to learn more about using ggmap and street level mapping.

If you want to learn more about ggmap and street level mapping, I encourage you to take a look at the document "ggmap: Spatial Visualization with ggplot2" from the R Journal, which can be viewed in PDF format here. What I have presented is really a quick and dirty set of examples that just begin to scratch the surface of what ggmap can do for you.

About the author: Phil Rack is President of MineQuest Business Analytics, LLC located in Grand Rapids, Michigan. Phil has been a SAS language developer for more than 25 years. MineQuest provides WPS and SAS consulting and contract programming services and is an authorized reseller of WPS in North America.

Creating a Street Level Map with WPS and the Bridge to R

Creating a street level map using the Bridge to R and WPS is actually pretty easy. As in our other examples (see the two previous blog posts), we again use ggmap to pull down a map from Google Maps and display it using HTML. Amazingly, this takes only four lines of R code. Here's an example:

 

%rstart(dataformat=man,data=,rGraphicsFile=);
datalines4;

   library(ggmap)

   bp <- "4045 28th St Se, Grand Rapids, MI, USA"
   grmap <- qmap(bp, zoom=12)
   print(grmap)

;;;;
%rstop(import=);

The code is fairly easy to follow. We load the ggmap library, which does most of the work for us. We center the map on the address "4045 28th St Se, Grand Rapids, MI, USA". The next line queries Google for the map at a specified zoom level (we are using zoom level 12). Finally, we print the map using the print function; plots must be printed explicitly when a script is sourced rather than run interactively.

This is what the map looks like.

b2rplt_1700486050_2_1

 

We can actually take this a bit further. Instead of using a known address, we can use a place of interest for querying and creating the map. If we replace the address in the code above with "White House, Washington DC, USA" we get a map like the one below.

b2rplt_1700487946_3_1

So now we have seen how easy it is to pull down a map from Google using ggmap and the Bridge to R for WPS. If you have a copy of the Bridge to R, I recommend you play with the demonstration programs to get an idea of what you can do with the software and the mapping service. It's always fun to see what gets rendered using ggmap, R and the Bridge to R.

About the author: Phil Rack is President of MineQuest Business Analytics, LLC located in Grand Rapids, Michigan. Phil has been a SAS language developer for more than 25 years. MineQuest provides WPS and SAS consulting and contract programming services and is an authorized reseller of WPS in North America.

Creating a WPS Launch Icon in Ubuntu

I use Ubuntu for my WPS Linux OS and it's pretty easy to install. However, unlike the vast majority of people out there who run WPS in batch mode, I like to run it in interactive mode using the Eclipse Workbench. Hence, I want an icon that I can click on to start WPS. Here's how to do it.

On the Ubuntu desktop, right mouse click on an empty part of the screen and you will get a little option menu. Click on “Create Launcher…” You will see a dialog box pop up that looks like:

clip_image002

On my Ubuntu Linux server, I installed WPS into a folder named wps-3.0.1. The directions below use that folder name as our example. You may have installed WPS into another folder, so be sure to adjust the paths accordingly when performing the tasks below.

Name: WPS 3.0.1

Command: /home/minequest/wps-3.0.1/eclipse/workbench

Comment: WPS 3.0.1 Linux

Click on the icon in the upper left-hand corner of the Create Launcher dialog box (the little spring) and you will get a choose-icon list box. Simply go to the WPS install folder and then into the eclipse folder. There you will find a file named icon.xpm. Click on icon.xpm, then click Open, and then click OK.
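Under the hood, the Create Launcher dialog simply writes a .desktop file onto your desktop. If you prefer to create the launcher by hand or from a script, a roughly equivalent file would look like the sketch below; the paths assume the example wps-3.0.1 install folder used above, so adjust them for your own installation:

```
[Desktop Entry]
Type=Application
Name=WPS 3.0.1
Comment=WPS 3.0.1 Linux
Exec=/home/minequest/wps-3.0.1/eclipse/workbench
Icon=/home/minequest/wps-3.0.1/eclipse/icon.xpm
Terminal=false
```

Save it as something like WPS.desktop on your desktop and mark it executable, and it behaves the same as the launcher created through the dialog.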

That’s all there is to it. You should have the WPS icon installed and available from your desktop.

About the author: Phil Rack is President of MineQuest Business Analytics, LLC located in Grand Rapids, Michigan. Phil has been a SAS language developer for more than 25 years. MineQuest provides WPS and SAS consulting and contract programming services and is an authorized reseller of WPS in North America.

WPS on Windows 8

Today is a big day for Microsoft. The company released Windows 8 to the public and, depending on who or what you read, it's either really great or something you don't want.

I went ahead and bought the $39.95 upgrade but have yet to install it. I will put it on my main workstation but I need a long weekend in case there are problems.

For those who use WPS, let it be known that WPS runs fine on Windows 8 as long as you download a current maintenance release. I’m not privy to what is involved in writing an application that is Windows 8 compatible but it’s nice to know that you can be running the latest and greatest if you want.

Also, if you have Windows Server 2012 and want to run WPS on that platform, you will also need to download and install the current maintenance release of WPS.

More on Disk Caches, WPS and FancyCache

Earlier, I wrote about FancyCache as a disk cache when running WPS and the performance improvements you can expect to see. There are a couple more things that come about when using FancyCache that are worth mentioning.

First, by using FancyCache you can reduce the number of writes to your SSD(s) and thus extend their life. This is accomplished by consolidating multiple writes to the same address when you use the deferred write option.

Second, FancyCache supports TRIM, which helps by avoiding writes of deleted data. With deferred writes enabled, FancyCache removes the corresponding writes from the cache and avoids writing redundant data.

Third, for WPS users in particular, if you have a large enough cache and set deferred writes to, say, 600 seconds, many of your smaller datasets can reside entirely in the cache (as opposed to being written to disk). With liberal use of PROC DATASETS or PROC DELETE to remove unnecessary datasets, the data is never written to the SSD at all, prolonging the life of the SSD by minimizing writes.
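To make the write-consolidation idea concrete, here is a toy model in Python. It is not how FancyCache is actually implemented, just an illustration of why deferred writes plus timely deletes reduce physical SSD writes: repeated writes to the same block collapse into one, and a temp dataset deleted before the flush never touches the disk at all.

```python
# Toy model of a deferred-write cache. Writes accumulate in RAM; only the
# blocks still pending when the deferral timer expires hit the SSD.
class DeferredWriteCache:
    def __init__(self):
        self.pending = {}           # block address -> latest contents
        self.physical_writes = 0    # writes that actually reached the SSD

    def write(self, block, data):
        # Overwrites the pending copy in RAM; no disk I/O yet.
        self.pending[block] = data

    def delete(self, block):
        # A block deleted before the flush (think: a temp dataset removed
        # with PROC DELETE) is simply dropped and never reaches the SSD.
        self.pending.pop(block, None)

    def flush(self):
        # Deferred-write timer expires: one physical write per surviving block.
        self.physical_writes += len(self.pending)
        self.pending.clear()
```

Three writes to the same block followed by a flush cost one physical write, and a block written and then deleted before the flush costs none, which is exactly the wear-reduction argument above.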

About the author: Phil Rack is President of MineQuest, LLC and has been a SAS language developer for more than 25 years. MineQuest provides WPS and SAS consulting and contract programming services and is a reseller of WPS in North America.

Analytic Workstations – Part III

After giving some thought to how I could improve some of the benchmarks that I presented in the previous blog post (Analytic Workstations Part II), I realized that I had three choices. I could opt for a faster processor such as an eight-core Intel i7, add more memory, or change out the I/O system where my temp work space resides.

Given the $500 budget and the $182 I have left to spend, the CPU upgrade would be prohibitive. I could swap out my 16GB of RAM for 32GB, but that too would put me over the remaining $182. The CPU upgrade would cost about $260 and the memory upgrade around $200. That leaves a disk system upgrade, which I can easily do with the $182 remaining in the kitty.

I decided to buy two 60GB SSDs and put them in RAID 0. That provides me with 112GB of usable temp storage space, and since my motherboard has two SATA III sockets available, it was an obvious choice. The two SSDs cost me $65 each and I even get a $10 rebate. I also had to buy a tray for the two drives, so my cost at this point is $438. Still under budget!

Installing the drives was pretty easy and below are the results from Anvil’s Storage Utilities for the RAID Array.

Drive F: RAID 0, 2x60GB, Level 2 Cache.

drive_f_ssd

 

Not too bad when you compare it against what was originally being used for the temp work space. Basically I get 980+ MB/s read speed and 840+ MB/s write speed. Contrast that with 186 MB/s read and 168 MB/s write in the old configuration. That's a 5x increase in read and write speeds.

I reran the benchmarks with FancyCache set at 4GB and a deferred write of 10 seconds. Using the new SSDs for temp work space, here are the benchmark results alongside the other tests done previously.

Record Count                 No Cache    Level 1     RAID 0 SSDs with
                                         4GB Cache   Level 1 Cache
1 Million (500 MB)   Real      17.823      10.66       10.5
                     CPU       14.118      14.695      14.2
2 Million (1 GB)     Real      33.751      21.189      21.5
                     CPU       28.204      28.828      29.3
4 Million (2 GB)     Real      59.888      43.672      44.4
                     CPU       57.611      58.141      59.4
8 Million (4 GB)     Real      2:08.7      1:35.4      1:32.7
                     CPU       1:58.2      2:00.2      1:59.5
16 Million (8 GB)    Real      6:59.6      7:04.7      4:16.4
                     CPU       4:05.5      4:06.3      4:07.6
32 Million (16 GB)   Real     25:38.7     25:45.3     10:00.6
                     CPU       9:10.6      9:26.5      8:41.1
64 Million (32 GB)   Real     57:31.7     56:04.8     23:49.2
                     CPU      19:51.9     18:52.9     19:11.5

(Times under one minute are in seconds; longer runs are mm:ss.)

 

As you can see in the chart above, as we moved into larger dataset sizes that are 16 million records ( 8 GB ) and larger, the SSD’s in the RAID 0 array really show their stuff. In the dataset sizes of 8 million records and less, the data basically sits in the 4GB cache the whole time so we don’t see any improvement in performance for those datasets.

The reduction in run time for the 32 million record dataset is 2.5x, and the 64 million record dataset shows a proportionate gain: the program was able to execute the test script 2.4 times faster in a Level 1 cached SSD environment. These are pretty amazing numbers.
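As a quick sanity check on those multipliers, the Real times from the benchmark can be converted to seconds and divided. This is a small illustrative sketch in Python (not part of the WPS workflow); the 32 million record times are 25:38.7 (no cache) versus roughly 10:00.6 (RAID 0 SSDs), and the 64 million record times are 57:31.7 versus 23:49.2:

```python
# Convert the Real times from the benchmark table to seconds and
# compute the no-cache vs. RAID 0 SSD speedups.
def to_seconds(mm, ss):
    return mm * 60 + ss

speedup_32m = to_seconds(25, 38.7) / to_seconds(10, 0.6)   # 25:38.7 vs 10:00.6
speedup_64m = to_seconds(57, 31.7) / to_seconds(23, 49.2)  # 57:31.7 vs 23:49.2

print(round(speedup_32m, 2))  # about 2.56
print(round(speedup_64m, 2))  # about 2.42
```

The arithmetic bears out the roughly 2.5x and 2.4x figures quoted above.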

Although I didn't achieve my goal of creating a machine where Real time was always less than CPU time, I did get pretty close. There is still room for experimentation with block size and Level 1 cache size to try to increase the performance of the machine, but the return on the time needed to do this would probably be minuscule.

Overall, this has been an enlightening experience for me. I’ve been able to take a pretty vanilla workstation and tune it using both software and hardware and show what $500 can do in terms of upgrading your hardware. Across the board, for every size dataset that I tend to process, I significantly reduced processing time when running WPS.

I’m going to go over the numbers a bit more in the coming days and I will post a round-up of my thoughts and present some justifications for what might be done going forward to improve performance for this workstation. After all, I still have $62 left in my budget.

Continue Reading: Analytic Workstations – Conclusions

About the author: Phil Rack is President of MineQuest, LLC and has been a SAS language developer for more than 25 years. MineQuest provides WPS and SAS consulting and contract programming services and is a reseller of WPS in North America.


Analytic Workstations – Part I

I'm often impressed by the amount of work done on run-of-the-mill corporate desktops and high performance workstations. When I talk to other WPS, SAS and R consultants, our discussions often turn toward the types of hardware we have experience with and what our own personal machines are made of. It is often a lively debate, with some "oohhhs and aahhhs" as we learn what each of us is using for our businesses.

One thing I need to mention, and I don't want to sound arrogant, is just how awful most of the corporate desktops used by quants and data managers are. I've worked at a site that still had XP with only 512MB of RAM and a single 40GB hard drive for their data folks. It was truly awful trying to get anything done. On the other hand, I've worked with a few small businesses who had pretty decent hardware and didn't have a problem investing in desktop upgrades, because they were smart and saw the ROI it would bring to the company.

As a reseller of WPS, I often get into discussions about the product and the hardware people run it on. I've sent out evaluations to folks whose personal workstations had 24GB of RAM and 24 logical CPUs. I was rather impressed when I heard folks were using such hardware. That's a rarity for sure, but as the need grows to process more data, and to process it faster, higher-end workstations are becoming the norm for many analysts.

Recently my desktop workstation’s motherboard died and after trying to find a replacement and seeing the cost, I decided it was time to buy some hardware that was faster and expandable. Time was of the essence but I also wanted to configure a machine that was a real workhorse for my daily development duties. This blog post is about picking and choosing some very specific hardware, the rationale behind my choices, configuration and some bench marks thrown in for good measure.

Most of the work I do is WPS and SAS development and testing of code against datasets that are moderate in size. I would venture to guess that the vast majority of my datasets are between one million and four million records with 50 to 70 variables per dataset. In terms of storage, that works out to be between 500MB and 2GB per dataset. Although not common, I have datasets that do range upwards of eight million records and approximately 4GB in size. Given that metric, I wanted my new hardware to be able to handle datasets in the 1 to 4 million record size as a matter of routine.

As many WPS and SAS developers are quick to admit, the Achilles heel for running datasets of that size is I/O. Many modern CPU’s are able to crunch data quite well, but getting that data off the disk and into memory is the key objective when processing WPS datasets.

So what are my measurable goals?

  • Keep hardware and software cost under $500 by using the existing case, power supply and hard drives.
  • Be able to process 4 million records where real time is lower than CPU time. Ideally be able to process 8 million records using this metric, but that may be tough.
  • Create an I/O subsystem for the WPS work library that is highly tuned to processing WPS datasets.

Hardware

  • Intel i5 processor at 3.4GHz
  • 16GB of dual-channel RAM (4×4GB)
  • ASUS motherboard with four 6Gb/s SATA and four 3Gb/s SATA ports
  • WD Caviar Black 640GB hard drives x 2 in RAID 0 mode for Work
  • Seagate 320GB x 4 in RAID 0+1 mode for the C drive

Total outlay for the motherboard, memory and CPU was $318.00. The drives listed above were part of the Workstation that was being upgraded.

Establishing a Baseline

It’s important that we establish a baseline for measuring what we are using in terms of hardware. Below are two charts that show both drive arrays for our Work Station.

C: Drive – hosts the OS, WPS and permanent WPS datasets.

drive_c_baseline

D: Drive – WPS Work Space

These are not the fastest hard drives, but they are what we have to work with at the moment. Essentially, the benchmarks reveal the following performance statistics. One thing to note from the charts above is that the IOPS are really pathetic.

Drive      Read           Write          Total
C          238.34 MB/s    243.59 MB/s    481.94 MB/s
D          186.64 MB/s    168.91 MB/s    355.55 MB/s

We wrote a benchmark program that reads two permanent datasets from the C drive, combines them into a temporary dataset and does a few calculations on them. From there on, all of the processing takes place using that temporary dataset. Below are the benchmark results for seven different dataset sizes.

Record Count            Real       CPU
1 Million (500 MB)       17.823     14.118
2 Million (1 GB)         33.751     28.204
4 Million (2 GB)         59.888     57.611
8 Million (4 GB)         2:08.7     1:58.2
16 Million (8 GB)        6:59.6     4:05.5
32 Million (16 GB)      25:38.7     9:10.6
64 Million (32 GB)      57:31.7    19:51.9

(Times under one minute are in seconds; longer runs are mm:ss.)

The benchmarks above are disappointing to me. Not once did we achieve our stated goal of processing any of the datasets with Real time less than CPU time. I'm actually surprised by this, as I expected the OS to be able to buffer at least the smallest dataset of 1 million records (500 MB), allowing that metric to be met. Also, as a note, for the datasets of 32 and 64 million records we had to set Memsize to 9G, that is, 9 gigabytes of memory available to the WPS system, so that PROC SORT would not start using virtual memory and spend all of its time paging.

In the next post Analytic Workstations Part II, I’m going to start to tune the I/O of the workstation to see if we can reduce Real processing time and achieve the goal of Real Time being lower than CPU time.

Continue to: Analytic Workstations – Part II

About the author: Phil Rack is President of MineQuest, LLC and has been a SAS language developer for more than 25 years. MineQuest provides WPS and SAS consulting and contract programming services and is a reseller of WPS in North America.

What We Use: Our Favorite Hardware Gear

I love to hear what other people and organizations are using in terms of hardware and software to not only run their business, but as personal productivity enhancements. I thought it would be fun to share what we have here in house for development. This particular post will focus on the hardware side.

Custom Linux Server – 16GB of RAM and 4 logical CPUs. This box is the basis for our WPS development and testing environment for the Linux platform. Lots of work space (three 640GB SATA III drives in RAID 0) for temp work, and four 640GB drives in RAID 5 for permanent datasets.

Windows Server 2008 R2 – A four-core CPU with 8GB of RAM and two RAID 1 arrays, each with 2x1.5TB drives. There's another 4TB of assorted space for business data. This box is used mainly for testing WPS Windows apps and for backing up the various desktops. It also provides us with remote connectivity when we are on the road.

Desktops – a number of assorted desktops from various manufacturers. All have at least 8GB of RAM and a couple terabytes of storage. All desktops have at least four cores so performance is decent.

Apple Mac Mini – This is a recent purchase. It has a four-core Intel i5 CPU, and the box (if you want to call it that) came with 2GB of RAM. I immediately upgraded it to 8GB but think it might be time to go to 16GB since memory is so inexpensive. The latest pricing for two 8GB sticks that would work in that machine is about $160.

All the desktops have dual monitors, even the Mac Mini. By the way, if you need to buy an adapter for the Mini DisplayPort for the 2nd monitor, check out the prices at MonoPrice.com. I paid less than $7.00 USD for an adapter to output to DVI. Apple wanted $29.00 for it!

We also have a couple of notebook computers. Both are dual core and great for traveling, but doing any hardcore development on them would be painful.

Printers – We just recently gave a Tektronix Phaser color printer to charity. In house we have a Canon MFC 4150 for everyday printing and a Lexmark 543DN color laser. Both are wonderful printers but the next one we buy will have to be wireless. The Canon just doesn’t support printing and scanning on the Apple OS X operating system.

Of course, we have the assorted headsets and telephone systems to complement our business requirements. It's amazing how quickly one can load up on junk hardware. It's very hard for me to throw or give older equipment away... it's the packrat in me.

About the author: Phil Rack is President of MineQuest, LLC and has been a SAS language developer for more than 25 years. MineQuest provides WPS and SAS consulting and contract programming services and is a reseller of WPS in North America.


Add a Submit Batch Menu to Windows Explorer

One of the things that I find fascinating as a consultant is how different consultants and software developers design and implement their work environments. For the most part, when I go onsite, I almost always keep their vanilla setup just as they have implemented it. For my notebook and personal workstations, I customize them quite heavily.

I think customization of workstations/desktops (and workplace environments too) is fairly common for most hard-core programmers and “Rock-Star” developers. These folks tend to develop tools so they can gain efficiencies in their day-to-day tasks. Over the years, I’ve seen some amazing customizations and I try to borrow their ideas just as much as possible.

A hallmark of a great WPS/SAS developer, in my mind, is the ability to manage multiple tasks. Some of us can multi-task three or four projects at a time, whereas others can only multi-task sequentially. I always liked that phrase and have waited years to use it in a blog! But it's true! The easier it is to perform some task, the easier it is to automate it and gain efficiencies by running multiple tasks or jobs at the same time.

Today, I want to focus on WPS. In a WPS environment using the Eclipse Workbench, you can view logs and listings and edit multiple WPS programs while a program is running, but you are limited to running a single program at a time from within the Workbench. I know many developers, and count me in as one of them, who like to run a long-running WPS program in the background while doing some editing in the Workbench.

In the SAS environment on the Windows desktop, you can run a SAS program in batch by right clicking on a .SAS program and selecting "Run SAS Batch" or something similar. That's something I've always liked, and unfortunately it's missing in the current WPS environment. So I've put together some simple instructions that show you how you can do the same thing.

In a previous blog post, I provided a Windows CMD file that ran WPS in batch. I slightly modified this CMD file and named it "Submit WPS Batch.CMD"; you can find a link to it at the bottom of this post. This CMD file will run a WPS program in batch, and you can easily add a Windows Explorer menu item so that when you right click on a program with the .SAS extension, you can run it in batch mode.

Below is the Windows Explorer Window with the Submit WPS Batch menu item added.

image

The first thing you have to do to get this working is to download the submitbatch.zip file and unzip it to a file folder such as c:\temp or some other folder.

Step 2 is to go to a folder that has an existing WPS program, right click on a file with the .SAS extension, and select "Properties" from the context menu. Then click the "Change" button and the "Open With" selection box will open.

clip_image002

Click on “Browse” and navigate to the folder that you saved the Submit WPS Batch.CMD file and select that program and then click OK. You will see that the properties box for the program will have the following attributes. Type of file: SAS File (.sas) and Opens with: Submit WPS Batch.

clip_image002[10]

The only other modification you need to do is edit the Submit WPS Batch.cmd. Make sure line 12

SET wpsloc=c:\Program Files\World Programming WPS 2

points to the proper folder where WPSI.EXE is located.

That's all there is to adding a menu item to Windows Explorer so that you can submit a WPS program to run in batch in the background, with the log and lst files placed in the folder where the submitted program resides.

Link: Submit_WPS_Batch.zip (1,488 bytes)

About the author: Phil Rack is President of MineQuest, LLC and has been a SAS language developer for more than 25 years. MineQuest provides WPS and SAS consulting and contract programming services and is a reseller of WPS in North America.