Multi-Tier Reference Apps for Benchmarking

This series lists the benchmarking applications available, providing an analysis of the installation and operation of key benchmark apps.

 



Summary

Controversy Timeline

    June 2001 Oracle Pet Store

    Oracle publishes a benchmark of the performance of Sun's Java Pet Store blueprint application running on the Oracle 9i Application Server.

    November 2001 Microsoft C# Port of Petstore

    Microsoft announced a benchmark comparing the performance of Sun’s Java Pet Store blueprint application running on the Oracle 9i Application Server against the performance of a Microsoft "port" of the Java Pet Store to C# for the .NET platform.

    Dean Wampler's "Cat Fight in a Pet Store" 11/28/2001

    March 2002 Oracle

    In March 2002, Oracle published a benchmarking study claiming that its revised implementation of the Sun Java Pet Store 1.1.2 is "10 times faster than Microsoft .NET".

    May 2002 Microsoft

    In May 2002, Microsoft hired VeriTest (a supposedly "independent" lab which conducts tests to certify Windows) and invited Oracle to participate in a head-to-head re-test, which Oracle declined. VeriTest therefore repeated Oracle's tests using the Mercury LoadRunner test scripts that Oracle had published on its Web site.

    VeriTest raised questions about Oracle's published data, noting "serious flaws, including missing application functionality and questionable test script settings." Using their own settings, VeriTest then reported that Microsoft .NET is "10 times faster" than Oracle's app server.

    June 2003 "Draw" Declaration

    The performance of .NET 1.0 and its J2EE rivals was declared about the same in a June 2003 benchmark report by The Middleware Company, which compared Microsoft's .NET against its J2EE competitors. The J2EE vendors refused to be named, probably because an earlier version of the benchmark had declared .NET the winner.

    Sep 2005 BEA MedRec-Spring

    BEA's MedRec application is rewritten to use the Spring framework and J2EE.

    Where's the Microsoft .NET 2.0 version that can be used for comparison now? Perhaps one using Iron Speed?

    May 12, 2006 Java EE 5 AJAX-based Petshop

    Sun releases the Java Pet Store 2.0 reference application (early access) to illustrate use of the Java EE 5 platform to design and develop an AJAX-enabled Web 2.0 application. The application comes with full source code available under a BSD-style license [download 537.3kb]. See also Adrian Lanning's 01-27-04 guide to setting up a Windows computer to run JPetStore 3.x with MySQL and Tomcat.

    Spring Pet Clinic Sample App (136.4kb).

 

Sun's BluePrints

Microsoft's Reference Benchmark Apps

    Microsoft has developed several reference benchmark apps as Architectural Sample Applications.

    Its own reference application, implemented in both Microsoft Visual C# and Microsoft Visual Basic .NET, is version 7 of Duwamish for .NET.

    Microsoft Petshop.NET 3.0 Installation

    Microsoft's Petshop 3.0 .NET Sample Application downloads as file PetShop 3.0 Installer.msi (771KB, published 5/23/2003). Invoking this file extracts file Petshop3.msi, which:

    • installs by default into folder C:\Program Files\Microsoft\PetShop\
    • creates databases "MSPetShop" and "MSPetShopOrders"
    • creates in IIS virtual directory "MSPetShop"
    • creates COM+ application ".NET Pet Shop"

    The sample provides the code behind the October 2002 article Using .NET to Implement Sun Microsystems' Java Pet Store J2EE BluePrint Application and the May 2003 article Design Patterns and Architecture of the .NET Pet Shop (created by Gregory Leake of Microsoft and James Duff of Vertigo Systems) for use in application benchmarks comparing the performance and scalability of .NET Web applications against an equivalent, revised, and fully optimized J2EE application.

    Microsoft developed PetShop.NET to provide a direct comparison against Sun's Pet Store, then at Version 1.1.2 (for J2EE SDK 1.2.1), available among "Guidelines, Patterns, and code for end-to-end Java applications".

    However, Sami Jaber argues in his insightful November 2002 article PetShop.NET [2.0]: An Anti-Pattern Architecture (translated from the original French) that PetShop.NET is an anti-pattern (a mistaken solution) because its single namespace (PetShop.Components) has methods such as GetProductsBySearch() containing SQL statements in service-layer code rather than in a separate data-access (DAO) layer, as in the Java Pet Store. "If the application must evolve/move from the thin client (ASPX) to the fat client (WinForms), no change in the service layer should be necessary."

      " The design used by PetShop.NET is a hybrid mixture of an service object, persistent object and DAO. This approach of course reduces the number of code lines dramatically, but may be inappropriate"

    For this reason, Jaber, through his DotNetGuru.com community, aims to rewrite PetShop.NET to use a true N-tier architecture for greater agility, using an Abstract Factory pattern between layers and "Remoting/WebService/Local calls in the service layer and real O/R Mapping tool or DAO in the Data tier just by changing configuration file," so that it can be a true .NET Best Practice sample.
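
    To make the layering Jaber advocates concrete, here is a minimal Java sketch (hypothetical names, not actual PetShop or DotNetGuru code) of a service tier that delegates to a DAO obtained from a factory, so the data-access technology can change without touching the service layer:

      // Hypothetical sketch of the layering Jaber advocates (not actual PetShop code).
      // The service tier contains no SQL; a DAO obtained from a factory owns all data
      // access, so swapping the data tier (JDBC, O/R mapper) needs no service-layer change.
      import java.util.Collections;
      import java.util.List;

      interface ProductDao {                          // data-tier contract (DAO)
          List<String> findProductsBySearch(String keyword);
      }

      class JdbcProductDao implements ProductDao {    // one concrete data tier
          public List<String> findProductsBySearch(String keyword) {
              // SQL would live here, and only here
              return Collections.emptyList();         // placeholder
          }
      }

      class DaoFactory {                              // Abstract Factory between layers;
          static ProductDao productDao() {            // a real app would read configuration
              return new JdbcProductDao();
          }
      }

      class CatalogService {                          // service tier: no SQL, no JDBC types
          public List<String> getProductsBySearch(String keyword) {
              return DaoFactory.productDao().findProductsBySearch(keyword);
          }
      }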

 

App Server Software Product Benchmarks

    IBM has its own Commercial Performance Workload (CPW).

    SAP Benchmarks

    SAP AG provides its Standard Application Benchmarks to compare the performance of various hardware and database choices.

    SAP's benchmarking procedure is standardized and well defined. It has been monitored since 1995 by the SAP Benchmark Council, made up of representatives of SAP and technology partners involved in benchmarking. Originally introduced to strengthen quality assurance, the SAP Standard Application Benchmarks can also be used to test and verify the scalability, concurrency, and multi-user behavior of system software components, RDBMS, and business applications.

    The Council created a hardware-independent throughput metric called SAPS (SAP Application Performance Standard), where 100 SAPS is defined as 2,000 "fully business processed" standard Sales and Distribution (SD) order line items per hour. Since each sales order contains 5 line items, 2,000 / 5 = 400 repetitions are performed per benchmark hour, for a total of 2,000 + 400 = 2,400 SAP transactions in each baseline hour. This means 2,400 / 60 = 40 SAP transactions per minute, or 40 / 60 = 0.67 transactions per second.

    Each repetition in SAP-SD requires 15 posting dialog steps (screen changes) between login and logoff.

    Repeat | Input Screen Change (LR Transaction Name) | TCode | DB
    n/a | Configuration | - | -
    Once for Init. (Entry) | I01_InvokeSAPGUI | - | -
    | I02_Logon_SAP | - | -
    | I03_EasyAccess_menu | - | -
    (15 posting steps) | R01_VA01_Invoke (Create Sales Order) | VA01 | -
    | R01a_Order_Type_Lookup | - | 30%
    | R01b_Sales_Org_Lookup | - | 30%
    | R01c_Dist_Channel_Lookup | - | 30%
    | R01d_Sales_Division_Lookup | - | 30%
    | R01e_Sales_Office_Lookup | - | 30%
    | R01f_Sales_Group_Lookup | - | 30%
    | R02_Second_screen_5_items | - | -
    | R03_Second_screen_Save | - | -
    | R04_VL01N_invoke (Create Outbound Delivery with Order Reference) | VL01N | -
    | R05_First_screen | - | -
    | R06_First_screen_Save | - | -
    | R07_VA03_invoke (Display Sales Order: Initial Screen) | VA03 | -
    | R08_Display_customer_order_Enter | - | -
    | R09_VL02N_invoke (Change Outbound Delivery) | VL02N | -
    | R10_Choose_[F20]_Post_goods_issue | - | -
    | R11_VA05 (List of Sales Orders) | VA05 | -
    | R12_List_orders_Enter | - | -
    | R13_VF01_invoke (Create_invoice) | VF01 | -
    | R14_Create_invoice_Save | - | -
    Once to Exit | X01_Call_end | - | -
    | X02_log_off | - | -
    | X03_log_off_Confirm | - | -

    So 400 x 15 = 6,000 posting dialog steps are processed per baseline hour (excluding entry and exit steps, which are not counted in the metric).

    With 10 seconds of user "think time" between user actions and an average 2-second response time, each step takes 12 seconds, so a baseline run represents 12 seconds x 6,000 steps = 72,000 user-seconds, or 72,000 / 3,600 sec./hr. = 20 user-hours, which means 20 concurrently active users in a given hour.
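
    As a sanity check of the arithmetic above, here is a small back-of-the-envelope calculation (my own sketch, not an official SAP formula or tool) that derives the dialog-step and user counts for the 100 SAPS baseline:

      // Back-of-the-envelope arithmetic for the 100 SAPS baseline described above
      // (my own sketch, not an official SAP formula or tool).
      public class SapsArithmetic {
          public static void main(String[] args) {
              int lineItemsPerHour = 2000;       // 100 SAPS = 2,000 SD order line items/hour
              int lineItemsPerOrder = 5;
              int repetitions = lineItemsPerHour / lineItemsPerOrder;       // 400 reps/hour
              int stepsPerRepetition = 15;       // posting dialog steps per SD repetition
              int dialogSteps = repetitions * stepsPerRepetition;           // 6,000 steps/hour

              int secondsPerStep = 10 + 2;       // 10 s think time + ~2 s response time
              int stepsPerUserHour = 3600 / secondsPerStep;                 // 300 steps/user/hour
              int concurrentUsers = dialogSteps / stepsPerUserHour;         // 20 active users

              System.out.println(repetitions + " repetitions, " + dialogSteps
                      + " dialog steps, " + concurrentUsers + " concurrent users");
          }
      }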

    My assumption is that all users are logged in and actively working (although realistically some logged-in users would be idle).

    Will faster servers or more RAM enable the environment to support more simultaneous users and process more transactions per hour from a correspondingly larger number of emulated users?

    On May 24, 2007, Oracle announced that its 10g RAC database running SAP ERP 2005 (2-tier) on IBM AIX machines with two dual-core 4.7 GHz POWER6 processors achieved 20,120 SAPS by generating 402,330 fully processed order line items/hour using 1,207,000 posting dialog steps/hour. 4,010 users were emulated, so each user generated 301 posting dialog steps per hour, or 301 / 60 ≈ 5 steps per minute, which is one step every 12 seconds: the average dialog response time was 1.96 seconds, plus 10 seconds added to emulate user "think time" between each step.

    These results were obtained with the CPU at or near 100% utilization, so these numbers should be considered maximum possible values.

    Results for Sales and Distribution (SD), run by IBM in Beaverton, OR, are reported for these configurations:

      2-tier, where a single central server runs the database and all application instances.
      3-tier, where a single central server accesses the database on another server.
      3-tier parallel, where several central servers access the database on another server, with users equally distributed across all database nodes (using a round-robin method).

    The test transactions access the main tables of the SAP Sales & Distribution (SD) application:

    1. LIPS (Delivery note detail)
    2. VBAP (Item detail)
    3. VBRP (Billing detail)
    4. STXL (text file detail)

    Combining the SAPS metric with the SPECInt benchmark rating for the server hardware used for the test enables capacity comparisons and estimation.

    Although not all SAP modules (applications) are modeled, SAP's IDES (Internet Demonstration and Evaluation System) contains master, transaction, and customizing data for real-life business scenarios that can be run in the SAP System of integrated Financials, HR, and Logistics applications for a model international company with subsidiaries in several countries, producing products using several processes (engineer-to-order, make-to-order, and mass production; repetitive and process chemical/pharmaceutical manufacturing; retail).

    With each major product release SAP updates IDES for download from the SAP Service Marketplace (with a valid SAP S-User ID). IDES is used in training exercises and SAP's system demonstration portal.

 

Oracle (BEA) WebLogic Benchmarks

    Avitek MedRec (Medical Records) simulates an independent, centralized medical record management system, providing a framework for patients, doctors (physicians), and administrators to manage patient data.

    MedRec is written in Java (J2EE) and runs on WebLogic Server software, which Oracle acquired when it bought BEA, to demonstrate WebLogic Server features and recommended best practices.

    Download and Install

    The installer for MedRec is within the OracleATS (Application Test Suite) download.

    1. cd \
    2. cd C:\bea\WEBLOG~1\samples\server\medrec\setup\config
    3. setMedRecEnv.cmd
      This sets (replaces) environment variables
        WL_HOME=C:\bea\weblogic81
        PRODUCTION_MODE=@PRODUCTION_MODE
        JAVA_VENDOR=BEA
        JAVA_HOME=C:\bea\jrockit81sp5_142_08
        SAMPLES_HOME=C:\bea\weblogic81\samples
        MEDREC_SRC=%SAMPLES_HOME%\server\src\medrec\src
        CLASSPATH to add rt.jar, webservices.jar, ejbgen.jar, ojdbc14.jar
    4. cd C:\bea\WEBLOG~1\samples\server\medrec\src
    5. ant deploy.dev

    Create the MedRec domain and MedRec server.

    Running MedRec

    On Windows machines, MedRec can be accessed from the Windows Start > Programs > BEA WebLogic Platform 8.1 > Examples > Weblogic Server Examples > Medical Records Medrec > Launch Medical Records.

      This invokes C:\bea\weblogic81\common\bin\console.exe /k "C:\bea\weblogic81\samples\domains\medrec\startMedRecServer.cmd"

    1. PointBase JMS JDBC store listening on port 9092 is started automatically in another command window.

      The default Weblogic username and password is "weblogic".

    2. Node Manager is started from
        folder C:\bea\weblogic81\server\bin
        file startNodeManager.cmd
    3. An Internet browser window appears. Caution! It takes over whatever IE window is open.

      At the upper right corner, note the link to the tutorial.

      Clicking on "Start" invokes http://localhost:7001/start.jsp shown below:

    On Linux and other platforms, start MedRec from the WL_HOME\samples\domains\medrec directory, where WL_HOME is the top-level installation directory for the WebLogic Platform.

    My breakdown of the login and other pages offered by the BEA MedRec application is on another page of this site.

    [Screenshot: Running Example BEA App]

 

    MedRec Architecture

    MedRec includes a service tier of Enterprise JavaBeans (EJBs) that work together to process requests from client applications in the presentation tier and from Web applications, Web services, and workflow applications. The application includes message-driven, stateless session, stateful session, and entity EJBs.

    The MedRec physician web app is intended to be housed in a different server/domain, accessing the database managed by the patient and admin web apps in their own domain.

    Open http://127.0.0.1:7001/console and log in with "weblogic" as both username and password.

      From JAVA_OPTIONS in startManagedWebLogic.cmd within C:\bea\user_projects\domains\medrec
        -Dcom.bea.medrec.xml.incoming=incoming
        -Dphys.app.wsdl.url=http://localhost:7001/ws_medrec/MedRecWebServices?WSDL

    At the presentation layer, MedRec uses JavaServer Pages (JSP) tags and Jakarta Struts 1.0 controller logic, populating beans that invoke Actions within the service tier. The login request flows as follows (a code sketch follows this list):

    1. Login HTML posts to Login.do.
    2. The Web container matches the *.do pattern in the web.xml deployment descriptor and forwards the request to
    3. the Jakarta Struts 1.0 ActionServlet, which
    4. dissects the incoming URL and, based on the action mapping in struts-config.xml,
    5. dispatches to the Login Action controller (which extends the Base Action), processing the JavaBeans needed by the request.
    6. After success, the struts-config.xml action mapping provides the forward value (the Patient Home JavaServer Page).
    7. Java code within the Patient Home JavaServer Page is evaluated to HTML and
    8. sent back as a response to the client user browser for display.
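
    Here is a minimal Struts 1.0-style Action sketch (hypothetical code, not the actual MedRec source) to make that flow concrete; the ActionServlet matches the *.do pattern, consults struts-config.xml, and dispatches to a class like this:

      // Hypothetical Struts 1.0-style controller, illustrating steps 3-6 above
      // (not the actual MedRec source).
      import java.io.IOException;
      import javax.servlet.ServletException;
      import javax.servlet.http.HttpServletRequest;
      import javax.servlet.http.HttpServletResponse;
      import org.apache.struts.action.Action;
      import org.apache.struts.action.ActionForm;
      import org.apache.struts.action.ActionForward;
      import org.apache.struts.action.ActionMapping;

      public class LoginAction extends Action {
          // Struts 1.0 actions implement perform(); Struts 1.1+ renamed it execute()
          public ActionForward perform(ActionMapping mapping, ActionForm form,
                                       HttpServletRequest request, HttpServletResponse response)
                  throws IOException, ServletException {
              // Call the service tier here to validate credentials (omitted in this sketch)
              boolean authenticated = true;          // placeholder for the real check
              // "success" is mapped in struts-config.xml to the Patient Home JSP
              return mapping.findForward(authenticated ? "success" : "failure");
          }
      }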

 

    MedRec Apps Instrumentation

    Instrumentation should recognize three MedRec applications:

    1. Admin app
    2. Patient app
    3. Physician app, which is accessed by a Physician client (considered an external app) that accesses a Controller separate from the Admin Controller.

    JMX support by WebLogic Server’s MBeanServer is obtained through Spring’s MBeanServerConnectionFactoryBean, whose byproduct is an MBeanServerConnection established during application deployment and cached for referencing beans.

    The MBeanServerConnectionFactoryBean exposes monitoring, runtime controls, and the active configuration of a specific WebLogic Server instance and of WebLogic Server's Diagnostics Framework by returning WebLogic Server's Runtime MBean Server and Domain Runtime MBean Server.
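
    As an illustration (my own sketch, not MedRec code), a plain JMX client can reach the same Runtime MBean Server that Spring's MBeanServerConnectionFactoryBean would be configured against; the URL, port, and credentials below are assumptions based on the default MedRec setup described earlier:

      // Sketch: connect to WebLogic's per-server Runtime MBean Server over t3
      // (the Domain Runtime MBean Server is published on the admin server under
      // .../weblogic.management.mbeanservers.domainruntime). Requires wljmxclient.jar.
      import java.util.Hashtable;
      import javax.management.MBeanServerConnection;
      import javax.management.remote.JMXConnector;
      import javax.management.remote.JMXConnectorFactory;
      import javax.management.remote.JMXServiceURL;
      import javax.naming.Context;

      public class MedRecJmxProbe {
          public static void main(String[] args) throws Exception {
              JMXServiceURL url = new JMXServiceURL("t3", "localhost", 7001,
                      "/jndi/weblogic.management.mbeanservers.runtime");
              Hashtable<String, String> env = new Hashtable<String, String>();
              env.put(Context.SECURITY_PRINCIPAL, "weblogic");      // default MedRec credentials
              env.put(Context.SECURITY_CREDENTIALS, "weblogic");
              env.put(JMXConnectorFactory.PROTOCOL_PROVIDER_PACKAGES,
                      "weblogic.management.remote");                // WebLogic's JMX provider
              JMXConnector connector = JMXConnectorFactory.connect(url, env);
              MBeanServerConnection connection = connector.getMBeanServerConnection();
              System.out.println("MBeans visible: " + connection.getMBeanCount());
              connector.close();
          }
      }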

    Instrument the app using JMX to let HP OpenView Transaction Analyzer identify a server-based transaction bottleneck.

 

    Plants by IBM WebSphere

    Plants by WebSphere (PBW) is a fictional-store Web 2.0 sample application that uses Ajax-style Dojo Toolkit JavaScript widgets in the presentation layer, along with model and control layers.

    YOUTUBE: Web 2.0 FEP Plants by WebSphere Demo shows the screens.
    [Screenshot: Plants by WebSphere benchmark app]

    The single-page application sells flowers, plants (fruits and vegetables, trees), and gardening accessories. Its users can view an online catalog, select items, and add (drag-and-drop) items to a cart. When the cart contains items, you can proceed to log in, supply credit card information, and check out.

    The web service uses a PLANTSDB database via JDBC. When using a clustered server environment, configure the database using type Derby Network Server.

    The grid widget that displays the content of the catalog obtains its information by issuing requests to the server using dojo.xhrGet and dojo.xhrPut. The server's response is sent back in XML format and contains detail information plus a Uniform Resource Locator (URL) reference to where the image is located.

    The PlantsByWebSphere application includes an RPCAdapter layer used to map traditional J2EE constructs such as Enterprise Java Beans (EJBs), Web services, and POJOs to lightweight constructs such as JavaScript Object Notation (JSON) or XML data. The data can easily be consumed and rendered by JavaScript-based clients using Dojo. The RPCAdapter is used to map client-side GET requests to legacy EJB or Servlet session data. The data is returned as XML and used as input to construct the Dojo widgets within the browser.
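
    As a rough illustration (a hypothetical servlet, not the actual PBW or RPCAdapter code), the kind of XML response the grid's xhrGet call consumes might look like this on the server side:

      // Hypothetical servlet sketch (not actual Plants by WebSphere code) returning
      // catalog items as XML with an image URL, the shape a Dojo grid could consume.
      import java.io.IOException;
      import java.io.PrintWriter;
      import javax.servlet.http.HttpServlet;
      import javax.servlet.http.HttpServletRequest;
      import javax.servlet.http.HttpServletResponse;

      public class CatalogXmlServlet extends HttpServlet {
          protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                  throws IOException {
              resp.setContentType("text/xml");
              PrintWriter out = resp.getWriter();
              out.println("<?xml version=\"1.0\"?>");
              out.println("<catalog>");
              // In PBW this data would come from the PLANTSDB database via JDBC/EJB
              out.println("  <item id=\"T0001\" name=\"Bonsai\" price=\"30.00\""
                      + " image=\"/images/bonsai.jpg\"/>");
              out.println("</catalog>");
          }
      }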

    Instructions for Accessing the Samples

 

    Oracle Benchmarks

    Oracle ASB 11i Single and RAC systems

    Trifork

    Trifork, a Danish/San Jose developer of the T4 Enterprise Application Server that competes with Oracle and other J2EE-compatible application servers, at one time reported that it created a 3,670-line J2EE reimplementation of the .NET PetStore 1.1, mimicking the 3,758 lines of .NET code by reusing the database layer and employing Java Struts as the view-layer framework.

    Macromedia/ColdFusion MX

    Steve Peterson noticed that Pet Shop provides no graphical information for each type of pet. Macromedia's PetMarket benchmark was therefore built for usability, using Rich Internet Application features of Macromedia Flash MX and Macromedia Flash Remoting to consume less bandwidth. The MX benchmark showed that ColdFusion MX scales under a load of 700 simultaneous users using Windows 2000 AS SP2 on 800 MHz machines.

 

Set screen "Independent" Benchmarks

    Macworld Speedmark

    Macworld magazine's Speedmark uses the 1.25 GHz Mac mini as its baseline to compare how fast casual and power users can perform 15 "everyday" tasks using 9 real-world applications (including Apple's OS X 10.4.5 "Tiger" operating system).

    Task | Application
    1. Startup OS |
    2. Duplicate 500MB file |
    3. Open Multiple Folders |
    4. Create Zip Archive of 1GB folder |
    5. Unzip 1GB file | archiver
    6. Word Scroll | Microsoft Office X (with Rosetta to run on Intel-based Macs)
    7. Word Search/Replace |
    8. Download E-mail |
    9. Convert AAC files to MP3 from hard drive | iTunes 6.0.3
    10. Export to QuickTime for Email | iMovie HD
    11. Apply Aged video effect |
    12. Import 100 photos from hard drive | iPhoto 6.0.1
    13. Multiple Page Loading Test | Camino 1.0
    15. Render | Cinema 4DXL 9.1
    16. Antalus Botmatch at 1024x768 Max Settings with sound and graphics enabled (Average Frames Per Second) | Unreal Tournament 2004

    Charles Gaba's System shootouts

    SPECjAppServer2004

    The $2000 jAppServer2004 multi-tier benchmark application (at v1.05) measures the performance of a single J2EE v1.3 application server running all major J2EE technologies:

    • The web container, including servlets and JSPs
    • The EJB container
    • EJB2.0 Container Managed Persistence
    • JMS and Message Driven Beans
    • Transaction management
    • Database connectivity

    The app simulates Dealer, Manufacturing, Supplier and Corporate domain logical entities.

    SPECjAppServer2004 result reports use as their performance metric the number of JOPS (jAppServer Operations Per Second) completed during the Measurement Interval: the total number of business transactions completed in the Dealer Domain plus the total number of workorders completed in the Manufacturing Domain, normalized per second.
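
    Stated as a formula (my paraphrase of that definition, with purely illustrative numbers):

      JOPS = (Dealer Domain transactions + Manufacturing Domain workorders) / measurement interval in seconds

      e.g., 90,000 Dealer transactions plus 18,000 workorders over a 3,600-second interval would score (90,000 + 18,000) / 3,600 = 30 JOPS.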

    The app includes a Supplier Emulator Java Servlet that can run inside any Java enabled web server to emulate the sending and receiving of orders to/from suppliers.

    The app includes a client driver (run on a separate machine) that exercises all parts of the underlying infrastructure that make up the application environment:

    • hardware
    • JVM software
    • database software (usually a single instance)
    • JDBC drivers
    • system network

    However, "SPECjAppServer2004 strives to stress the middle-tier rather than the client tier or the database server tier."

    The Standard Performance Evaluation Corporation (SPEC) is a non-profit corporation formed to establish, maintain and endorse a standardized set of relevant benchmarks that can be applied to the newest generation of high-performance computers. SPEC develops suites of benchmarks and also reviews and publishes submitted results from their member organizations and other benchmark licensees.

    TPC-App

    The TPC-App benchmark web services application simulates the activities of a distributor running business-to-business transactional application servers in a 24x7 environment. TPC-App showcases the performance capabilities of application server systems.

    The workload was published August 2005 to exercise commercially available application server products, messaging products, and databases associated with such environments, which are characterized by:

    • Multiple on-line business sessions
    • Commercially available application environment
    • Use of XML documents and SOAP for data exchange
    • Business to business application logic
    • Distributed transaction management
    • Reliable and durable messaging
    • Dynamic web service response generation with database access and update
    • Simultaneous execution of multiple transaction types that span a breadth of business functions.
    • Databases consisting of many tables with a wide variety of sizes, attributes, and relationships
    • Transaction integrity (ACID properties)

    TPC-App result reports use as their performance metric the number of SIPS (Service Interactions Per Second) completed by each application server during the Measurement Interval. "Total" SIPS refers to the entire cluster of servers in the configuration (the SUT, or System Under Test).

    The lone report on 6/21/05 measured 174.9 SIPS per server.

    The distinctiveness of TPC benchmarks is that the SIPS metric is associated with dollar costs, such as the $327.41/SIPS published for an IBM eServer xSeries x366 using an Intel Xeon DP 3.60GHz CPU running MS .NET 1.1 and Microsoft SQL Server 2000 on Windows 2003 Standard Edition.
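
    As a rough cross-check (my own arithmetic, assuming the $/SIPS figure refers to the same single-server result cited above): 174.9 SIPS x $327.41/SIPS ≈ $57,264 for the total priced configuration, since TPC price-performance divides the cost of the entire SUT by the performance metric.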

    Caution! Important Notes:

      "The workload was designed specifically to stress the Application Server. As such, the work to be performed by the database was purposely minimized."

      "TPC-App does not benchmark the logic needed to process or display the presentation layer (for example, HTML) to the clients."

    The non-profit Transaction Processing Performance Council is based in San Francisco, California, USA.

 

From Monitoring Software Vendors

    Sitraka/Quest/Dell PerformaSure

    The PerformaSure package includes a benchmark J2EE app specifically designed to test speed and scalability. It's now a component of Quest Software's Foglight Application Performance Management (APM) Suite for the J2EE platform.

    Quest was purchased by Dell in 2012.

    Benchmark Factory, from xaffire.com, was also purchased by Quest.

    BAPCo

    BAPCo (Business Applications Performance Corporation) is a non-profit consortium of manufacturers and publishers (including ZDNet and CNet) that offers "scientifically" designed workloads (for $1,000 each):

    • SysMark 2012 (64- and 32-bit Lite versions) reflects usage patterns of desktop business users in office productivity, data/financial analysis, system management, media creation, 3D modeling, and web development. It costs up to $10,000 for global companies.

      Unfortunately, results from previous editions (2007 and 2004, at $1,000) cannot be compared against this current version. The 2004 product emulates MS Office, Adobe After Effects, Photoshop, Premiere, Macromedia Flash, Dreamweaver, McAfee, WinZip, etc.

    • MobileMark 2012 invokes "real-world" apps to measure battery life and performance simultaneously on Windows tablets. It costs from $1,295 to $14,999 for global enterprises. Previous versions were in 2007, 2005, and 2002.

      The whole suite is from $1,850 to $20,000 for global enterprises.

    • WebMark 2004 - measures resource consumption by two categories of usage against websites created by the benchmark team:
      • "Information Processing" users view a portal, conduct researching, and receive training on the Internet,
      • "Commercial Transaction" users perform "purchasing", "Finance" online brokerage, and "Marketplace" transactions
      Findings include:
      • An increase in CPU speed resulted in a proportionate improvement in response time.
      • An increase in main memory had a negligible effect.
      • A 4% improvement is gained by changing from IDE disk at 7200 RPM to Serial ATA disk.
    • TabletMark characterizes the performance of Windows 8 RT tablets. It is free.

    PassMark

    PassMark PerformanceTest performs low-level tests to calculate a summary score (such as 1814) that can be compared against scores and baselines from other Windows PC machines. In addition to 2D graphics, 3D graphics, and CD tests:

    Its Memory test calculates:

    • Allocate Small Block
    • Read Cached
    • Read Uncached
    • Write
    • Large RAM

    Its Disk test calculates:

    • Sequential Read
    • Sequential Write
    • Random Seek + RW

    Its CPU test calculates:

    • Integer Math
    • Floating Point Math
    • Find Prime Numbers
    • SSE
    • Compression
    • Encryption
    • Physics
    • String Sorting

 

From Test Tool Software Vendors

    Advantage On-line Banking from HP Software

    The AOB web app is proprietary to HP and not released to the public.

    Excilys Bank from Gatling

    excilys-bank is a fake e-banking app created for use as the web app under test by the open-source web stress tool Gatling-tool.org (named after the Gatling gun, the first machine gun). Gatling is written in the Scala language and offers a DSL (Domain Specific Language); it includes a scenario recorder and results reporting.

    The web app under test is hosted on Cloud Foundry (Cloudbees.net).

    The app can also be downloaded from GitHub, but then it needs to be compiled (using Maven). To avoid repeating this, I created an AMI within Amazon.

    100 user IDs are generated with username userN and password passwordN, where N is a number. For example, the first name of user99 is Foo99 and the last name is Bar99.
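
    For illustration, a small helper (my own sketch, not part of excilys-bank; the exact numbering range is an assumption) could write those credential pairs to a CSV file for use as load-test data:

      // Hypothetical helper (not part of excilys-bank): writes the userN/passwordN
      // credential pairs described above to a CSV usable as a load-test data feed.
      import java.io.FileNotFoundException;
      import java.io.PrintWriter;

      public class CredentialCsvGenerator {
          public static void main(String[] args) throws FileNotFoundException {
              try (PrintWriter out = new PrintWriter("users.csv")) {
                  out.println("username,password");
                  for (int n = 1; n <= 100; n++) {     // 100 accounts; exact range assumed
                      out.println("user" + n + ",password" + n);
                  }
              }
          }
      }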

    [Screenshot: Excilys Bank accounts]

    The currency is euros because the app was developed by a company in France that offers services around Gatling.

    The Debit account 1 transaction list:

    [Screenshot: Excilys Bank Debit account]

    Transfer | Perform:

    [Screenshot: Excilys Bank transfer]

    When the transfer is performed, the app throws a Technical Exception:

    [Screenshot: Excilys Bank exception]

    Borland

    http://demo.borland.com/

    nopCommerce

    demo.nopcommerce.com is used as a benchmark app because it is free and has sample data.

    Its publicly-available demo site is reset every hour.

    nopCommerce is a .NET-based system that uses jQuery UI.

 