Blog

Work Completed

  • Updated the Building Tutorial wiki page to include sections on the building decision support analyses. The page now has up-to-date text describing the analyses, and I did test runs to make sure everything works as written in the tutorial. I opened MAE-1211 and fixed a discrepancy between what the Multi-Attribute Utility Analysis field name should have been and what it actually was. I also noticed a few minor out-of-date parts in the help file and updated them.
  • Updated the building decision support help text. I noticed a few fields described in the text were out of date. This work was also tracked as part of MAE-1211 since the two tasks were similar.
  • Added the MAEviz Pipeline Damage Tutorial to the wiki and updated the text and images. This tutorial had originally been done for the Turkish Gas Pipelines so the text was updated with our sample fictitious pipeline dataset for Shelby County.
  • Finished adding the bridge tutorial. Next week I will add sections covering the bridge decision support (NBSR) as a tutorial.

Work Planned

  • Continue adding MAEviz tutorials to the wiki and updating them to be current with the latest version of MAEviz. The MAEviz pipeline tutorial needs to be added to the wiki. Also, the MAEviz decision support section needs to be added to the building tutorial.
  • Finish adding the bridge tutorial to the wiki

Comments

This week went as planned.

Work Completed

  • Continued working on adding the bridge damage tutorial to the wiki. Updated screenshots and text to match the current MAEviz version.

Work Planned

  • Work on adding the bridge damage tutorial to the MAEviz wiki

Comments

This week went as planned. I took personal days from the 16th - 19th. I will be back to full time on Monday the 22nd.

Work Completed

  • Followed up with Chris Powell about the potential NSF proposal; awaiting her response. I will try to arrange a short teleconference to get some ideas on the table so we can get started writing. The due date is July 8, 2009.
  • Replaced some fragility files with updated ones from Can. I also tested the analyses to make sure everything was working properly. I also updated the default sets to point to the new files. This work is tracked as MAE-1210.
  • Finished adding the building damage tutorial and the introductory tutorial parts to the wiki and I updated screenshots and text to make it current. The section on decision support still needs to be added.
  • I started adding the bridge damage tutorial, updated the old screenshots from it and made the text current with MAEviz 3.1.1.

Work Planned

  • Add Building Damage Tutorial and Introductory Tutorial to the wiki. This should make the tutorial materials more accessible to the user from Jordan and any future clients.
  • Add Bridge Damage Tutorial to the wiki.
  • Update Utility Fragilities in MAEviz and update Default Sets
  • Contact Chris Powell about NSF proposal opportunities.

Comments

This week went as planned. I will be taking personal days June 16 - June 19, 2009, returning to work on Monday June 22, 2009.

Work Completed

  • We were contacted by Bill Spencer to help a researcher in Jordan get building data into MAEviz. Most of the tutorials for MAEviz involve data already being in MAEviz so we are writing up some wiki documentation on getting new data into the software. I started this work and will continue web-izing our tutorials into the MAEviz wiki next week.
  • Continued adding more text to the blog entry about JUnit and jMock testing libraries. I have a simple (but complete) example that takes the developer through the process of getting a unit test working with these libraries. Unit Test/jMock Tutorial
  • Continued setting up the unit testing framework that will test parts of the Data Catalog as they are completed. Also, I am following the updates to the wiki about the data catalog as they are made and contributing as I can. I added a section about unit testing and some simple conventions.
  • Contacted Chris Powell of CERI about collaborating on an earth science NSF proposal. She and a colleague, Chris Cramer, are interested in pursuing something so Jong and I will lead this effort.
  • Attended the Friday CET meeting, 2 - 3 pm, via Google Chat.

Work Planned

  • Contact Chris Powell to pursue potential earth science collaboration.
  • Continue Unit testing blog entry with an example using JUnit and jMock.
  • Continue following Data Catalog wiki updates
  • Create tutorial for ingesting building data and fragility mapping data into MAEviz with step-by-step instructions.
  • Web-ize the current MAEviz tutorials by adding them to the wiki.
  • Attend Friday CET Meeting, 2 - 3 pm.

Comments

This week went as planned. Next week I will continue with the data catalog, NSF proposal and MAEviz tutorial web-izing.

Work Completed

  • Attended CET Staff Meeting
  • Downloaded jMock 2.5.1 and added it to an Eclipse plugin.
  • Started working with some code that uses JUnit and jMock to learn more about using jMock for the Data Catalog testing framework. I also started a blog entry that gives an introduction to JUnit and jMock; I will add more text next week, along with a complete example, so others can learn from what I have done and so I can capture lessons learned as I use jMock.
  • Started setting up some test classes to begin building the Data Catalog so that unit tests become an integral part of the development of the Data Catalog and not something that happens after classes are designed. This will also allow the unit tests to evolve with the class.

Work Planned

  • Monday - Holiday
  • Attend CET Staff Meeting Thursday, 3 - 4pm
  • Take a closer look at JMock for unit testing the new Data Catalog. I will be writing a blog entry about what I learn along with a complete example of using JMock for unit testing with Eclipse.
  • Start setting up unit testing framework for the new Data Catalog

Comments

This week went as planned. I expect to continue learning more about JMock and JUnit next week and adding more text to my blog entry.  Also, we will find out whether we have been invited for a full TRANSIMS proposal this week so I might be writing more text for the proposal towards the end of the week.

Introduction


The intent of this blog entry is to track what I learn while working through an example using JUnit 4.3.1 and jMock 2.5.1. This entry is a work in progress and will be updated as I go along. By the end of this, I think you will agree that jMock is an excellent tool for mocking up something that might otherwise be heavyweight, and that unit testing in general is absolutely critical to production software.

JUnit


Let's get the introductions out of the way for those who don't know about JUnit. Put simply, JUnit is an open source framework for testing source code. It is critical for any production application to include a thorough set of unit tests so that errors can be detected immediately, even before new code is checked in. Every class should have a unit test that verifies it behaves as the author expected, and to ensure this is the case, the test should be written alongside the new class so that testing starts immediately. Any subsequent changes to the class can then be verified by re-running its test before the changes are committed to the repository. If an untested scenario is discovered, it can be added to the unit test; this way the test evolves with the class, and future changes that reintroduce the same problem are detected and fixed immediately.
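To make that workflow concrete, here is a dependency-free sketch of what a JUnit test does under the hood. The Counter class and its test are made-up examples; in real JUnit the check would live in an @Test method and use assertEquals instead of a boolean return.

```java
// A made-up class under test.
class Counter {
    private int value;
    void increment() { value++; }
    int getValue() { return value; }
}

// What a JUnit test does, stripped of the framework: exercise the
// class, then verify the result. JUnit adds annotations, runners,
// assertion helpers, and reporting on top of this pattern.
class CounterTest {
    static boolean testIncrementTwice() {
        Counter c = new Counter();
        c.increment();
        c.increment();
        return c.getValue() == 2;  // JUnit: assertEquals(2, c.getValue())
    }

    public static void main(String[] args) {
        System.out.println(testIncrementTwice() ? "PASS" : "FAIL");
    }
}
```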

jMock


jMock is a library that supports test-driven development of Java code with mock objects. Mock objects are useful because they can help you test the interaction between the objects in your program. Here are some bullet points from their website:

The jMock library:

  • makes it quick and easy to define mock objects, so you don't break the rhythm of programming.
  • lets you precisely specify the interactions between your objects, reducing the brittleness of your tests.
  • works well with autocompletion and refactoring of your IDE
  • plugs into your favorite test framework
  • is easy to extend

Now, imagine you need an object that requires an LDAP (lightweight directory access protocol) server to construct. You could start an LDAP server, give it some data, run your test, and then tear it down, but that's a lot of work and more closely resembles integration testing. A faster alternative is to use jMock to mock up an object that returns the data you need. Mocking up objects is only the tip of the iceberg: you can also set expectations on objects (e.g. this method is called exactly once), specify the order methods should be called in, test multi-threaded code, etc.
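To see what jMock automates, here is a hand-rolled version of the same idea, using a hypothetical DirectoryService in place of a real LDAP-backed object. jMock would generate this mock for you and verify the call-count expectation automatically.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical collaborator that would normally need a live LDAP server.
interface DirectoryService {
    String lookup(String name);
}

// A hand-rolled mock: records calls and returns canned data.
// jMock generates this kind of object and checks expectations
// (e.g. "lookup is called exactly once") for you.
class MockDirectoryService implements DirectoryService {
    final List<String> lookups = new ArrayList<String>();

    public String lookup(String name) {
        lookups.add(name);                          // record the interaction
        return "cn=" + name + ",dc=example,dc=org"; // canned response
    }
}

class MockDemo {
    static int run() {
        MockDirectoryService mock = new MockDirectoryService();
        // The "code under test" uses the mock instead of a real server.
        String dn = mock.lookup("jdoe");
        if (!dn.startsWith("cn=jdoe")) throw new AssertionError("bad canned data");
        return mock.lookups.size();  // expectation: exactly one call
    }
}
```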

Example


In this example, we will create a mock Sensor with jMock and add it to a SensorNetwork. We will set the expectation that the sensor receives a message from the sensor network; if no message is received, the test will fail. This example is intentionally simple in order to illustrate the potential of jMock. It is worth noting that we will extend MockObjectTestCase, which uses the legacy JUnit 3 classes. The reason is that Eclipse comes bundled with JUnit 4.3.1, and the @RunWith(JMock.class) notation of the latest version of jMock requires JUnit 4.4 or later. To add JUnit 4.4 or later, we would have to delete the JUNIT4 folder from the Eclipse plugins directory to avoid a SecurityException thrown because Eclipse loads its own JUnit 4 before ours. We want to avoid altering the Eclipse plugins directory, so we will use as much JUnit 4 notation as possible until the next Eclipse update. This example would only require a few minor modifications to run with the latest jMock under JUnit 4.4+, so if you want, you can make the alteration and see the jMock website for the minor changes to our test case.

Setup

  1. Get the plugin org.jmock (2.5.1) from MAEviz SVN (svn+ssh://subversion.ncsa.uiuc.edu/CVS/ncsa-plugins).
  2. JUnit 4.3.1 or later (already bundled with Eclipse).

That's it for external plugins. Now, we'll need to create a project to contain our class under test and a project to contain the tests. I prefer the following packaging conventions:
ncsa.myexample - eclipse plugin
ncsa.myexample.tests - eclipse fragment with ncsa.myexample as the host plugin

After creating those projects, we'll need to create something to mock up and test. First, we are going to create a simple Sensor interface that can receive a message and a SensorNetwork class that contains some sensors and can publish messages to them.

Sensor.java
// interface for a sensor
public interface Sensor
{
  // send the sensor a message
  public void receive( String message );
}
SensorNetwork.java
import java.util.LinkedList;
import java.util.List;

// contains a network of sensors, simplified with one real method to publish messages
public class SensorNetwork
{
  private List<Sensor> sensors;

  public void publish( String message )
  {
    for(Sensor sensor : getSensors()) {
      sensor.receive( message );
    }
  }

  public void addSensor(Sensor sensor)
  {
    getSensors().add( sensor );
  }

  public List<Sensor> getSensors()
  {
    if(sensors == null) {
      sensors = new LinkedList<Sensor>();
    }
    return sensors;
  }
}
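Before wiring in jMock, it can be reassuring to sanity-check the classes above with a hand-written Sensor. The RecordingSensor below is just for illustration, and the Sensor and SensorNetwork sources are repeated so the snippet is self-contained:

```java
import java.util.LinkedList;
import java.util.List;

interface Sensor {
    void receive(String message);
}

class SensorNetwork {
    private List<Sensor> sensors;

    public void publish(String message) {
        for (Sensor sensor : getSensors()) {
            sensor.receive(message);
        }
    }

    public void addSensor(Sensor sensor) {
        getSensors().add(sensor);
    }

    public List<Sensor> getSensors() {
        if (sensors == null) {
            sensors = new LinkedList<Sensor>();
        }
        return sensors;
    }
}

// A hand-written Sensor that just remembers the last message it received.
class RecordingSensor implements Sensor {
    String lastMessage;
    public void receive(String message) { lastMessage = message; }
}

class SensorNetworkDemo {
    static String run() {
        SensorNetwork network = new SensorNetwork();
        RecordingSensor sensor = new RecordingSensor();
        network.addSensor(sensor);
        network.publish("Hello World");
        return sensor.lastMessage;  // the published message reached the sensor
    }
}
```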

Next, we need to create a test class to test out our sensor. You should create the following class in your test fragment:

SensorNetworkTest.java
import org.jmock.Expectations;
import org.jmock.integration.junit3.MockObjectTestCase;
import org.junit.Before;
import org.junit.Test;

public class SensorNetworkTest extends MockObjectTestCase {


  SensorNetwork network;

  @Before
  public void setUp() throws Exception
  {
    network = new SensorNetwork();
  }

  @Test
  public void testOneSensorReceivesMessage() throws Exception
  {
    // Our mock object
    final Sensor sensor = mock( Sensor.class );

    // Add our mock sensor to the network
    network.addSensor( sensor );

    final String message = "Hello World";

    // Define Expectations
    checking( new Expectations() {
      {
        oneOf( sensor ).receive( message );
      }
    } );

    // send a message, without this line the test will fail
    network.publish( message );
  }
}

The last step is to run this as a plain JUnit test, not as a JUnit Plug-in Test. If you try to run it as a JUnit Plug-in Test, it fails with errors about missing classes because it wants JUnit 3.8.2 or 3.9.0. There are definitely some odd dependency issues here: parts of Eclipse seem to depend on JUnit, and you get a clash between JUnit 3 and JUnit 4. I have only been able to successfully run a Plug-in Test with JUnit 3; doing so would require removing the JUnit 4 annotations and running everything as JUnit 3.

Problems Encountered


  1. I get a java.lang.SecurityException when trying to run my unit test.
    • This exception occurs because the Eclipse plugins folder contains a JUNIT4 folder with junit.jar version 4.3.1 or lower, while our build uses org.junit version 4.4, which contains JUnit4ClassRunner, a class that we need. The solution is to delete the JUNIT4 folder from the plugins directory so that the latest JUnit 4 is used. There is no way around this because we need that class runner to use jMock.
  2. My tests pass even though I have set an expectation using JMock that does not pass (e.g. method must be called once and it is not called).
    • This can happen if, at the top of the class, you have not specified that the test should be run with jMock using the annotation @RunWith(JMock.class).

Links


Links to the libraries used in the above example.

Work Completed

  • Final review of submitted TRANSIMS - TRANSviz proposal using MAEviz/Bard as the base platform
  • Updated more wiki text for the Bard v1.0 architecture (NCSA-GIS), documenting the current APIs and adding images. Also started documenting some of the Bard v2.0 work that has been completed, calling it v2.0 of the architecture. This will eventually include the new Data Catalog that will replace the old Data Catalog currently in MAEviz.

Work Planned

  • Finish TRANSIMS/TRANSviz proposal
  • Continue working on the Bard v1.0 wiki and start v2.0 wiki to keep a well documented page available to increase our web presence.

Comments

This week went as planned. I took a personal day on Friday May 22, 2009.  I expect this coming week I will focus more on documenting the Data Catalog API wiki as Shawn and I continue to flesh out the details with the recent comments from Jim Myers.

Work Completed

  • Wrote text and helped edit the TRANSIMS - MAEviz proposal
  • Wrote text for the Bard v1.0 wiki. MAEviz was runner up for the best open source RCP application this year and the most likely reason was the lack of a great web presence. For Bard, we intend to create a better web presence, starting with a wiki that provides good documentation on the design and technologies it uses.
  • Wrote text for the MAEviz v1.0 wiki and uploaded images for both the old MAEviz (D2K version) and the current RCP based version. The intent was to preserve the past work as well as to document the current version of MAEviz and increase its web presence.
  • Attended the Cyberarch meeting and CET meeting via Google Chat.

Work Planned

  • MAEviz - TRANSIMS proposal is the top priority this week. The pre-proposal is due May 20, 2009.
  • Create a wiki for Bard v1.0
  • Create a wiki for the old version of MAEviz to preserve the work done for the precursor of the current MAEviz. Add screenshots and text. Also, add images for the current MAEviz versions wiki.

Comments

This week went as planned.

Work Completed

  • Helped add details to the Data Catalog API, added an image of what we expect the UIRegistry will look like for the import/export options. Also added text to describe what the ModelDescription, PhysicalModel and LogicalModels will look like and how we can utilize a lot of the Tupelo Bean API. I also discussed several parts of the Data Catalog API with Shawn to better understand how he envisioned the various parts working together.
  • As part of the above, I did some digging around to learn more about both the Tupelo Bean API and the usage of Java Beans.  I went through the tutorial on the tupelo page and read through the Java Beans tutorial on Sun's website.

Work Planned

  • Assist Shawn in fleshing out the details of the new Data Catalog API
  • Learn more about the Tupelo Bean API

Comments

This week went as planned. Next week I will be helping Jong put together a TRANSIM proposal.

Work Completed

  • Created an extension point for the UI wizard pages for setting up a context (e.g. an H2 context) so that as new context types become available in tupelo, we can create a setup page appropriate for that context to initialize it (if it makes sense).  This work was tracked as Bard-5, which had been partially completed for a while.  I added the extension point to make this a cleaner implementation than manually adding wizard pages.
  • Bard-56 - Looked into Eclipse property testers, which can be used to test conditions for manipulating menus.  In the RDF page view, the forward and backward navigation arrows were always visible, regardless of whether navigating forward/backwards was possible.  The property testers check whether it is possible to navigate in either direction and signal that the menu button should be updated.
  • Bard-87 (http://jira.ncsa.uiuc.edu/browse/Bard-87) - numerous UI components lacked any unit testing.  This was addressed and several bugs were found and fixed.  The tupelo shapefile adapter for directly reading shapefiles did not implement all constructors that the other shapefile adapters had; this was addressed with unit tests (red/green/refactor) and tracked as Bard-82 (http://jira.ncsa.uiuc.edu/browse/Bard-82).  ApplicationContextModelFactory test was created to test creating context models.  PredicateImageServiceTest was created to test obtaining predicate images; a bug was found in the class and fixed.  HistoryTest was created to test navigating forward/backwards under various conditions.  ContextPageSetupTest was created to test the aforementioned extension point for creating pages associated with creating new contexts.
  • Read through the wiki pages Shawn is creating/editing for the data catalog view
  • Attended the 2pm CET meeting via Google voice chat.

Work Planned

  • Bard-5 - UI for creating new contexts (e.g. new H2 Context, new Simple File Context)
  • Bard-56 Investigate PropertyTesters and use to determine when menus are available
  • Bard-87 (http://jira.ncsa.uiuc.edu/browse/Bard-87) - several UI features lack unit tests
  • Bard-82 (http://jira.ncsa.uiuc.edu/browse/Bard-82) - table adapter for shapefiles
  • Look over the data catalog wiki pages that Shawn is working on.

Comments

This week went as planned.

Work Completed

  • Assisted Xin Zong in understanding the fragility mapping format and in debugging his mapping file.  We had several email exchanges discussing this topic.  I might additionally help him with ingesting a building dataset; I told him I can assist if he runs into issues.
  • Edited the MAEviz NEES Paper and the final paper was submitted to the conference with highlights
  • Worked on Bard-83, created a PreferenceManager to pass preferences from UI plugins to non-UI plugins.  For example, we are saving context information in a set location.  Using preference pages, I exposed the option to the user to set the location from a UI plugin, but a non-UI plugin controls the saving/restoring of contexts so this information can be obtained from the PreferenceManager.  I also added a PreferenceListener to listen for changes and classes that use this information can update appropriately.
  • Finished and closed MAE-1200, adding liquefaction as an option for facilities.  I also updated the base task, which had a bug where liquefaction was incorrectly included twice in some special cases.
  • Finished MAE-1205, adding the option to include uncertainty in the facility damage analysis.
  • Bard-82 - Updated the TripleTableView to appropriately pass selections to the other views.  After adding the feature set adapter, some parts were not working or were working inefficiently.  Rather than changing the data input, I added a ViewFilter that simply filters the view to show the rows being highlighted, which greatly improved the efficiency of displaying highlighted rows.
  • Updated all help pages for the MAEviz analyses that were worked on to match the changes made.
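The actual Bard classes are not shown here, but as a rough, hypothetical sketch of the pattern described for Bard-83 (a shared manager that UI preference pages write to, non-UI plugins read from, and listeners observe for changes), something like the following could work; all names are made up:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the UI -> non-UI preference handoff.
interface PreferenceListener {
    void preferenceChanged(String key, String newValue);
}

class PreferenceManager {
    private static final Map<String, String> prefs = new HashMap<String, String>();
    private static final List<PreferenceListener> listeners = new ArrayList<PreferenceListener>();

    // Called from a UI preference page when the user changes a setting.
    static void set(String key, String value) {
        prefs.put(key, value);
        for (PreferenceListener l : listeners) {
            l.preferenceChanged(key, value);  // notify interested classes
        }
    }

    // Called from non-UI code, e.g. the class that saves/restores contexts.
    static String get(String key, String defaultValue) {
        String value = prefs.get(key);
        return value != null ? value : defaultValue;
    }

    static void addListener(PreferenceListener listener) {
        listeners.add(listener);
    }
}

class PreferenceDemo {
    static String run() {
        final String[] seen = new String[1];
        PreferenceManager.addListener(new PreferenceListener() {
            public void preferenceChanged(String key, String newValue) {
                seen[0] = key + "=" + newValue;
            }
        });
        // UI side sets the value; non-UI side reads it back.
        PreferenceManager.set("context.location", "/data/contexts");
        return seen[0] + ";" + PreferenceManager.get("context.location", "");
    }
}
```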

Work Planned

  • MAE-1200 - liquefaction for facilities
  • Finish editing MAEviz NEES Paper
  • MAE-1205 - gas facility damage analysis does not have a UI option for hazard uncertainty.
  • Bard-83 - Preference pages for Bard to expose functionality control to users for things like unifiers, contexts, etc.
  • Bard-82 - read shapefile dbf directly into a table

Comments

This week went as planned.

Work Completed

  • Updated text for the NEES MAEviz paper and sent it to Terry for the sections he was adding
  • Completed writing for section 2.1.3 of the MAE Annual Report as requested by Amr Elnashai and Bill Spencer
  • Fixed issue MAE-1202.  Can Unen requested that we add Leak Rate of pipes as an output of the pipeline damage analysis in MAEviz.  I also discovered a discrepancy between the break rate equation we were using and the one Can sent.
  • Resolved Bard-73, the Bard application should share a common BeanSession (Jeff's opinion from a meeting with him).  A single bean session is passed around the application now from the class that manages our Contexts.
  • Started Bard-82, which involves reading shapefiles into a table view directly from the shapefile instead of from triples to improve performance.  The shapefiles are stored as blobs in tupelo so we can just request the URI of the blob and read directly.  We want to stream the data into the view, so this will take a little more work to get the appropriate listeners in place to update the view as the data loads in a separate thread.  This is partially completed and will be worked on next week.
  • Attended Friday CET meeting via Google Video Chat
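As a rough, dependency-free sketch of the streaming idea described for Bard-82 (read in a background thread and notify a listener per row, so the view can update incrementally instead of waiting for the whole dataset), something like the following could work; all names here are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

// The view implements this to receive rows as they are read.
interface RowListener {
    void rowLoaded(String row);
    void loadFinished();
}

class StreamingLoader {
    // Reads rows in a background thread, notifying the listener per row.
    static Thread load(final List<String> rows, final RowListener listener) {
        Thread worker = new Thread(new Runnable() {
            public void run() {
                for (String row : rows) {
                    listener.rowLoaded(row);  // view appends one row
                }
                listener.loadFinished();
            }
        });
        worker.start();
        return worker;
    }
}

class StreamingDemo {
    static int run() {
        final List<String> received = new ArrayList<String>();
        List<String> data = new ArrayList<String>();
        data.add("row1"); data.add("row2"); data.add("row3");
        Thread worker = StreamingLoader.load(data, new RowListener() {
            public void rowLoaded(String row) {
                synchronized (received) { received.add(row); }
            }
            public void loadFinished() { }
        });
        // In a real view, updates arrive while loading; here we just wait.
        try { worker.join(); } catch (InterruptedException e) { return -1; }
        return received.size();
    }
}
```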

Work Planned

  • Writing for MAE Annual Report Section 2.1.3
  • MAE-1200, add liquefaction damage for gas/water facilities
  • MAE-1202, add Leak Rate to pipeline damage analysis
  • Bard-73, shared Bean Session
  • Bard-82, read shapefile table attributes directly from blob instead of from triples
  • Bard-83, investigate using preference pages to make available and remove custom unifiers for Bard
  • Friday, 2pm, CET Meeting

Comments

This week went as planned.  Next week, MAE-1200 is my top priority.

Work Completed

  • Resolved an issue Liang was having with the MAEviz bridge damage analysis
  • Helped answer questions that Xin Zong had about the MAEviz fragility mapping.  Xin is at the City College of New York working on a joint project with Amr Elnashai.
  • Revised NEES 7th Annual Meeting MAEviz Paper
  • Led the writing for the NEES Paper.  The paper was sent to Terry for his sections to be added
  • Consolidated two similar views in Bard into a single view, see Bard-69
  • Worked on saving and restoring unifiers, see Bard-70.  This was in conjunction with Bard-73.
  • Started reviewing the Tupelo Bean API, worked on an extension point for Bard Beans Mappings so we can generically generate them and add them to a Bean Session.  See Bard-73

Work Planned

  • Bard-69 - Consolidate two Bard Views that have similar functionality into a single view (Triple Table View and Table Selection View)
  • Bard-70 - Save and restore unifiers by storing them as beans
  • Bard-73 - Create a single bean session for Bard and create an extension point to define Bean Mappings so they can be loaded dynamically into a single bean session and used throughout the application
  • Lead writing for MAEviz NEES Paper

Comments

This week went as planned.

Work Completed

  • Worked on adding icons to the RDF pages for Rasters and Shapefiles.  I added the ability to add icons to the Form sections, Form labels and Form links, and added icons to the UI plugins for things such as files, folders, dates, etc. on the RDF pages to help visually cue the user about the type of data being viewed.  The Shapefile Ingester was updated so that it adds all file metadata regardless of whether all features were added to the context or the files were just stored as blobs.  This was tracked under Bard-56.
  • I updated the raster ingester to store metadata that the geotools software had, such as the raster column size, row size, datum, etc.  This was tracked as Bard-45.

Work Planned

  • Bard-56 Formatted RDF pages, add icons, additional metadata and store metadata regardless of ingesting all features
  • Bard-45 Raster Ingester, add more metadata such as boundary, datum, etc

Comments

This week went as planned.  I took personal days for Monday, Wednesday and Friday so it was a short week.

Work Completed

  • I completed and closed Bard-59:  create the necessary UI and history tracking to go forward and backwards when traversing the RDF Page View.  This was completed with help from Nathan.
  • Added an additional task to Bard-56 (create a formatted RDF page for Raster data), which has been started and is a work in progress.  Some changes had to be made to the Raster Ingester to add the additional metadata (see Bard-45).  I also continued adding metadata for Feature Datasets and the Schemas and updated the RDF Pages for these types to display the new information.  I spent some time reviewing information about Eclipse Forms, since I was unfamiliar with them and they are what the RDF Pages are based on.  In addition, Jong found a website describing GIS data with some already-defined metadata elements and a few of its own, called Dublin Core Light 4G.  I added this set of elements to Bard's GIS plugin.
  • I spent 1.5 hours on an ACM Skillsoft course about XML (an introductory course about DTDs, XML, etc.) on Friday (mostly over lunch) since this seemed appropriate as we move towards adding metadata to Bard/MAEviz

Work Planned

Comments

This week went as planned.  Working on the formatted RDF pages required me to dig into some basics about eclipse forms.  I also continued a course I had started a while back to continue learning more about XML.