Category Archives: High-Performance

Rapidly Deploy Business Systems – Introducing FRONDS

DataPlex’s AmpUp “Rapid Enterprise Deployment” software platform is being used to implement “million dollar systems” at a fraction of the cost. Each system is designed to enhance an organization’s already proven workflow, further leverage its business intelligence, and make it more productive and profitable. DataPlex AmpUp systems are 100% web-based and come mobile- and tablet-enabled for access from anywhere. Examples of already deployed systems are:

  • An Electronic Field Reporting System used by police officers to file incident reports more quickly and conveniently
  • A Parking Management System that processes parking lot data to provide real-time dashboards with management-oriented charts and reports
  • PlexChex, a web service that constantly monitors and sends alerts based on website and business system performance
  • A Patient Management System with advanced EMR support to reduce the staff workload of medical groups
  • A new Fundraising Organization Networked Data System (FRONDS) that helps non-profits become more efficient at donor development and membership management
AmpUp-Based FRONDS Now Live at Descanso Gardens

The most recent AmpUp-based enterprise system developed by DataPlex is its Fundraising Organization Networked Data System, known as FRONDS, which is now live at Descanso Gardens and enabling its staff to interact more effectively with its membership of 18,000 people. The “back office” of development, where all donations are managed, went live on June 1st, and the “front office” of the Visitor Center, with membership and ticket sales, went live on July 1st.

Not only did DataPlex engineers design a custom version of FRONDS for Descanso Gardens, called the Descanso Gardens Enterprise System (DGES), but they are also hosting it on Amazon’s high-performance EC2 cloud. DataPlex has truly become a one-stop shop that can work with an organization all the way from concept and design through development and hosting, and offers a suite of already developed modules for integrating enterprise, mobile and the cloud.

FRONDS is based on DataPlex’s AmpUp rapid development enterprise software platform that reduces the amount of time and cost to get a state-of-the-art web-based platform up and running. Many of the standard AmpUp features are available such as flexible searching, Excel/OpenOffice exporting, mail merging and consolidated “one button” month end reporting.

Some of the more interesting capabilities of FRONDS are:

  • Real-time dashboard display of daily activity
  • Print and scan ID cards
  • Point-of-sale (POS) operations: ticket, course and membership sales
  • Automatic letter generation with mail-merge capabilities
  • Track donor gifts by solicitations and campaigns
  • Donor communications Journal
  • Calendar of events
  • Budgets and Actuals: trend analysis and forecasting
  • Accounting Reports organized by General Ledger codes
  • Single-button Month-End Reporting
  • Multitude of workflow reports, document generation and Excel export

Descanso Gardens sample ID card

Descanso Gardens ID card

FRONDS prints membership cards, each with different information including member name, ID number and an associated scancode. The Descanso Gardens staff decided to print all the cards themselves rather than use a commercial printer. With two printers set up, each printing one card every 7 seconds, the entire print job took under 20 hours (18,000 cards across two printers at 7 seconds each works out to roughly 17.5 hours of printing). While printing, the staff was able to test-scan the cards for quality control, ensuring they kept to their aggressive timeline. Now that the initial cards have been printed, the staff only has to print a couple thousand cards a month, a comparatively easy task, and has the option to offload some of the ID card printing to the Visitor Center, where people who sign up can get their ID card instantly.

The system dashboard shows the staff the current weather conditions and real-time visitation and sales figures, and compares them to the same day of the week of the previous year.

FRONDS Dashboard

FRONDS Dashboard

FRONDS also has an interactive POS cash register that is 100% web-based and compatible with all major web browsers, including those on iPads and other tablets, and that supports touch screens and barcode scanners.

FRONDS Point-of-Sale Screen

FRONDS Point-of-Sale Screen


Descanso Gardens executives report that FRONDS is already proving its value:

  • Instantaneous reporting that will help us manage our enterprise better
  • Tracking member behavior, which improves our knowledge of a key, core constituency
  • Integration of POS with membership management
  • Much more powerful tools for member fulfillment and satisfaction:
    • instant enrollment and addition to the database
    • near-instant fulfillment of membership cards
    • much faster renewal process (soon to be also web-based)
    • better “member experience” at check-in
    • much better control of entrance gate
  • A much simpler, more user-friendly system for donor tracking, moves management, and reporting that is also much less expensive than widely used systems such as Blackbaud’s Raiser’s Edge and DonorPerfect

Even though Descanso Gardens had been using a different commercially available web-based donor management package, they will still realize a 25% savings in overall work effort by using FRONDS, which aligns more closely with their fundraising and membership management requirements.

DataPlex FRONDS in the Descanso Gardens Visitor Center

FRONDS in the Descanso Gardens Visitor Center

For more information about FRONDS, AmpUp, or any of the other AmpUp systems, or to discuss without obligation what DataPlex technical services can do for you, please feel free to contact us anytime.

What to do about Bots that kill AWS Micro Instances running WordPress

For one of our customers, we leveraged WordPress and its powerful capabilities to create a rather large website consisting of hundreds of pages. Because the expected traffic was low, we installed the site on an economical AWS Micro Instance, which performed well. In the middle of last night, however, the instance’s CPU utilization hit 100% for nearly an hour. Anyone accessing the site during this period would have had a sluggish, if not unresponsive, experience.

AWS Micro Instances are great for testing and deploying simple websites that, by their design and market, won’t be required to work very hard. That said, websites are at the mercy of whoever accesses them from the rest of the Internet. Too many accesses in too short a period can overrun the resource allotment of a Micro Instance.

In our investigation, we discovered that a bot called Aboundexbot was the culprit. Aboundexbot wanted to crawl the entire site, and quickly at that, which drove CPU utilization to 100% because AWS Micro Instances, as their name implies, are limited to a certain amount of CPU activity per unit of time. Unfortunately, Aboundexbot did not throttle its access as better-behaved bots do, and it apparently has no built-in mechanism (such as a timeout) to detect when it may be overtaxing a site.

In any case, to keep the site performing well, we decided that we just didn’t want Aboundexbot, and perhaps some other badly behaved bots, visiting our customer’s site. Our thought was to add a corresponding “disallow” entry to the “robots.txt” file. However, whereas this is a simple task for a regular website, it is more challenging for a WordPress-based site that has been installed in the domain root. In that case, all accesses to the site’s root files go through WordPress’s dynamic page generation, including requests for the virtual “robots.txt” file.

In the WordPress wp-includes/ folder there is a file called functions.php containing a function called do_robots(), which generates the “robots.txt” content on demand. But it’s not very sophisticated, allowing for just two types of output depending on the Site Visibility setting under WordPress’s Dashboard > Settings > Privacy page.

We could have added a plug-in that provides finer robots.txt control, and we still might, but to get a solution in quickly, we decided to simply enhance the do_robots() function as follows (our addition is the block that reads the supplemental wp-content/robots.txt file):

function do_robots() {
  header( 'Content-Type: text/plain; charset=utf-8' );

  do_action( 'do_robotstxt' );

  $output = "User-agent: *\n";
  $public = get_option( 'blog_public' );
  if ( '0' == $public ) {
    $output .= "Disallow: /\n";
  } else {
    $site_url = parse_url( site_url() );
    $path = ( !empty( $site_url['path'] ) ) ? $site_url['path'] : '';
    $output .= "Disallow: $path/wp-admin/\n";
    $output .= "Disallow: $path/wp-includes/\n";
    // Our addition: append any extra rules kept in wp-content/robots.txt.
    $extra = WP_CONTENT_DIR . '/robots.txt';
    if ( file_exists( $extra ) ) {
      $output .= file_get_contents( $extra );
    }
  }

  echo apply_filters('robots_txt', $output, $public);
}

We are currently on WordPress version 3.3.1. Because different versions of WordPress may have different code for this function, use your programming know-how to merge our addition into the function in the most appropriate way. Note that this is not a permanent change: any significant WordPress upgrade will overwrite functions.php and remove it.
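Since do_robots() already passes its output through the ‘robots_txt’ filter (the apply_filters() call at the end of the function), an upgrade-safe alternative is to hook that filter from the active theme’s functions.php instead of editing core. This is just a sketch of the idea; the function name dataplex_robots_txt is one we made up for illustration:

```php
<?php
// Upgrade-safe alternative: hook the 'robots_txt' filter from the active
// theme's functions.php rather than editing wp-includes/functions.php.
// dataplex_robots_txt is a hypothetical name; WP_CONTENT_DIR is a core constant.
function dataplex_robots_txt( $output, $public ) {
    // Leave private sites alone; they already emit "Disallow: /".
    if ( '0' != $public ) {
        $extra = WP_CONTENT_DIR . '/robots.txt';
        if ( file_exists( $extra ) ) {
            $output .= file_get_contents( $extra );
        }
    }
    return $output;
}
add_filter( 'robots_txt', 'dataplex_robots_txt', 10, 2 );
```

Because the theme’s functions.php is not touched by core upgrades, this version of the change survives a WordPress update, at the cost of depending on the active theme.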

We then located a list of other badly behaved bots and installed our collective list in the wp-content/robots.txt file, which is now included whenever our domain.com/robots.txt file is accessed.

For your reference, here is what we came up with for the contents of our supplemental robots.txt file. Note that WordPress places a few entries of its own in advance of this content.

User-agent: Aboundexbot
Disallow: /
User-agent: NPBot
Disallow: /
User-agent: TurnitinBot
Disallow: /
User-agent: EmailCollector
Disallow: /
User-agent: EmailWolf
Disallow: /
User-agent: CopyRightCheck
Disallow: /
User-agent: Black Hole
Disallow: /
User-agent: Titan
Disallow: /
User-agent: NetMechanic
Disallow: /
User-agent: CherryPicker
Disallow: /
User-agent: EmailSiphon
Disallow: /
User-agent: WebBandit
Disallow: /
User-agent: Crescent
Disallow: /
User-agent: NICErsPRO
Disallow: /
User-agent: SiteSnagger
Disallow: /
User-agent: ProWebWalker
Disallow: /
User-agent: CheeseBot
Disallow: /
User-agent: ia_archiver
Disallow: /
User-agent: ia_archiver/1.6
Disallow: /
User-agent: Teleport
Disallow: /
User-agent: TeleportPro
Disallow: /
User-agent: Wget
Disallow: /
User-agent: MIIxpc
Disallow: /
User-agent: Telesoft
Disallow: /
User-agent: Website Quester
Disallow: /
User-agent: WebZip
Disallow: /
User-agent: moget/2.1
Disallow: /
User-agent: WebZip/4.0
Disallow: /
User-agent: Mister PiX
Disallow: /
User-agent: WebStripper
Disallow: /
User-agent: WebSauger
Disallow: /
User-agent: WebCopier
Disallow: /
User-agent: NetAnts
Disallow: /
User-agent: WebAuto
Disallow: /
User-agent: TheNomad
Disallow: /
User-agent: WWW-Collector-E
Disallow: /
User-agent: RMA
Disallow: /
User-agent: libWeb/clsHTTP
Disallow: /
User-agent: asterias
Disallow: /
User-agent: httplib
Disallow: /
User-agent: turingos
Disallow: /
User-agent: spanner
Disallow: /
User-agent: InfoNaviRobot
Disallow: /
User-agent: Harvest/1.5
Disallow: /
User-agent: Bullseye/1.0
Disallow: /
User-agent: Mozilla/4.0 (compatible; BullsEye; Windows 95)
Disallow: /
User-agent: Crescent Internet ToolPak HTTP OLE Control v.1.0
Disallow: /
User-agent: CherryPickerSE/1.0
Disallow: /
User-agent: CherryPickerElite/1.0
Disallow: /
User-agent: WebBandit/3.50
Disallow: /
User-agent: DittoSpyder
Disallow: /
User-agent: SpankBot
Disallow: /
User-agent: BotALot
Disallow: /
User-agent: lwp-trivial/1.34
Disallow: /
User-agent: lwp-trivial
Disallow: /
User-agent: Wget/1.6
Disallow: /
User-agent: BunnySlippers
Disallow: /
User-agent: URLy Warning
Disallow: /
User-agent: Wget/1.5.3
Disallow: /
User-agent: LinkWalker
Disallow: /
User-agent: cosmos
Disallow: /
User-agent: moget
Disallow: /
User-agent: hloader
Disallow: /
User-agent: humanlinks
Disallow: /
User-agent: LinkextractorPro
Disallow: /
User-agent: Offline Explorer
Disallow: /
User-agent: Mata Hari
Disallow: /
User-agent: LexiBot
Disallow: /
User-agent: Web Image Collector
Disallow: /
User-agent: The Intraformant
Disallow: /
User-agent: True_Robot/1.0
Disallow: /
User-agent: True_Robot
Disallow: /
User-agent: BlowFish/1.0
Disallow: /
User-agent: JennyBot
Disallow: /
User-agent: MIIxpc/4.2
Disallow: /
User-agent: BuiltBotTough
Disallow: /
User-agent: ProPowerBot/2.14
Disallow: /
User-agent: BackDoorBot/1.0
Disallow: /
User-agent: toCrawl/UrlDispatcher
Disallow: /
User-agent: WebEnhancer
Disallow: /
User-agent: TightTwatBot
Disallow: /
User-agent: suzuran
Disallow: /
User-agent: VCI WebViewer VCI WebViewer Win32
Disallow: /
User-agent: VCI
Disallow: /
User-agent: Szukacz/1.4
Disallow: /
User-agent: QueryN Metasearch
Disallow: /
User-agent: Openfind data gatherer
Disallow: /
User-agent: Openfind
Disallow: /
User-agent: Xenu's Link Sleuth 1.1c
Disallow: /
User-agent: Xenu's
Disallow: /
User-agent: Zeus
Disallow: /
User-agent: RepoMonkey Bait & Tackle/v1.01
Disallow: /
User-agent: RepoMonkey
Disallow: /
User-agent: Zeus 32297 Webster Pro V2.9 Win32
Disallow: /
User-agent: Webster Pro
Disallow: /
User-agent: EroCrawler
Disallow: /
User-agent: LinkScan/8.1a Unix
Disallow: /
User-agent: Kenjin Spider
Disallow: /
User-agent: Keyword Density/0.9
Disallow: /
User-agent: Cegbfeieh
Disallow: /
User-agent: SurveyBot
Disallow: /
User-agent: duggmirror
Disallow: /

To test your change to the do_robots() function, just access your domain.com/robots.txt file from your favorite browser. Did it work? Let us know.
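For a scripted spot check, the same verification can be run from the command line with the PHP CLI. This is a minimal sketch, with domain.com standing in as a placeholder for your own site:

```php
<?php
// Fetch the dynamically generated robots.txt (domain.com is a placeholder)
// and confirm the supplemental rules made it into the output.
$robots = file_get_contents( 'http://domain.com/robots.txt' );
if ( $robots !== false && strpos( $robots, 'User-agent: Aboundexbot' ) !== false ) {
    echo "Supplemental rules are being served.\n";
} else {
    echo "Supplemental rules are missing -- check wp-content/robots.txt.\n";
}
```

A check like this can be dropped into a cron job so you notice quickly if an upgrade silently reverts the do_robots() change.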

Hopefully, this change will keep the Micro Instance from being overtaxed by overzealous bots. If you encounter a bot that simply ignores the robots.txt file, you may have to resort to adding a “deny from” entry in your server configuration, but in our experience we haven’t seen many of those.

DataPlex welcomes Mr. David Remba

We are thrilled to announce that Mr. David Remba, a high-performance computing, graphics and imaging expert previously of Walt Disney Animation, Kodak Cinesite Hollywood and Northrop Grumman, has joined DataPlex as one of its principal team members. Find out more about Mr. Remba’s experience at:

http://www.dataplex.com/company/leadership.php

UCLA Magazine, Premier Issue

DataPlex engineers were hired by a small, fledgling company known as “3D Systems” that had a lab curiosity it needed to turn into a product. DataPlex principals Harry Tarnoff, Warren Juran, Stuart Spence, Richard Harlow and David Remba all contributed to getting 3D Systems’ initial “stereolithography” products designed, built and out into the hands of customers, who became the first of what are now known as the “rapid prototyping” and “3D printing” markets. Read all about it here.

As part of the design effort, it was DataPlex CEO Harry Tarnoff who, against many odds, pushed stereolithography for medical applications. The result was the use of CAT scans to build models of human anatomy, yielding more accurate medical results while reducing the need for surgery. The premier issue of UCLA Magazine featured one such medical application:

UCLA Magazine, Vol. 1 No. 1
Click cover for a larger view.

The caption for the cover read as follows:

Computer generated, three-dimensional models developed by N.J. Mankovich, Ph.D., UCLA Radiology, surround Thomas Faraguna, whose fractured skull was replaced by a prosthesis designed with the aid of such tools… As example of tomorrow’s high-tech, whereby an exact duplicate of a patient’s bone is created by computer-guided laser, thus insuring an exact prothesis fit before surgery.