This is version 10.
It is not the current version, and thus it cannot be edited.
- Refactor PAR wf into two wfs (data acquisition, data analysis), reorganize par-data-clean step, replace with generic actors (Elementwise: min, max, is.na; UnivariateDistribution: min, max, range, interquartile range, mean, variance, median, stddev, histogram, density plot, autocorrelation, partial autocorrelation; Bivariate: correlation; statistical model fit: lm, glm, gam; filter actors) (Derik)
- Parviez' workflow model for wrapping a simulation using CommandLineActor -- review annotations and resolve issues
- Terr wf -- need to select data based on spatial and temporal bounds
- Parameter sweep example workflow (define reqs of param sweep actor, implement actor after checking existing actors and directors)
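To help define the requirements for the parameter sweep actor, a minimal sketch of the intended behavior (in Python for illustration only; the actor itself would live in Kepler's Java/Ptolemy framework, and the function and parameter names here are hypothetical):

```python
import itertools

def parameter_sweep(func, param_grid):
    """Run func once for each combination in the Cartesian
    product of the parameter grid; collect (params, result) pairs."""
    names = list(param_grid)
    results = []
    for values in itertools.product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        results.append((params, func(**params)))
    return results

# Example: sweep two parameters of a toy model.
runs = parameter_sweep(lambda a, b: a * b, {"a": [1, 2], "b": [10, 20]})
# First run pairs {'a': 1, 'b': 10} with result 10.
```

The key design question for the actor is whether the sweep is driven by a director iterating the subworkflow (as with existing Kepler iteration patterns) or by a dedicated actor emitting one token per parameter combination.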
- Review user manual section on Command Line Exec actor (section 5.5) (Parviez)
- Example wf pulling data from SensorBase at CENS using WebService actor or DBQuery actor
- Take loggernet out of the loop in the hardware demo (buy logger for testing) (Derik, Eric G)
- New workflow to create derived data for Met data (e.g., wind chill, dewpoint, etc.)
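The derivation steps would run inside the workflow (e.g., via the R or Matlab actors); as a sketch, the standard formulas are the NWS wind chill equation and the Magnus dewpoint approximation (function names here are illustrative, not actor names):

```python
import math

def wind_chill_f(temp_f, wind_mph):
    """NWS wind chill index, degrees F (valid for temp <= 50 F, wind >= 3 mph)."""
    v = wind_mph ** 0.16
    return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v

def dewpoint_c(temp_c, rel_humidity_pct):
    """Dewpoint in degrees C via the Magnus approximation."""
    a, b = 17.625, 243.04
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# 30 F with a 20 mph wind feels like roughly 17 F.
wc = wind_chill_f(30, 20)
```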
- Determine how to archive data
- Layman's version (without technical and scientific buzzwords) of the description of the goals for each step and workflow (a couple of paragraphs) (April 4th; 1 week) (Ilkay + Mark)
- Peter will write up his realization of why this is useful on top of just Matlab, with a few bullets (April 4th; 1 week) (provenance, chaining what is available, ...) (Peter)
- Conceptual workflow with input/output details and placeholders for different computational components (April 9th; 2 weeks) (Ilkay and Dan)
- Design the schema and build a relational database for the match-up datasets (April 11th; 2 weeks) (Dan and Ilkay)
- Recollection of requirements and comments (April 17th; 3 weeks) (Ilkay and everyone on SST)
- Working version0 (May 15th; 1.5 months - 4 weeks for implementation) (Ilkay and Dan)
- Definition of additional analysis steps (June 1st; 2 months) (Ilkay with input from Peter)
- GHRSST Meeting (June 9-13, 2008; France) presentation and demonstration (June 9th; 2 months) (Peter)
- Develop user interface and data structures for choosing a coverage and subsetting data using that coverage specification; may be a sampling actor and/or search (when: April 30 who: )
- Document OPeNDAP actor, add to tree, remove ODC feature, assign icon, test for release (Nathan, with Kirsten; next week)
- Matlab actor -- prepare for release, get library loading working, remove MatlabExpression version of actor (create bug to merge this as a fallback to JNI), make actor use TextArea for script, check documentation, complete test workflows (Daniel, with help from Kirsten, by mid April)
- Change ProvenanceRecorder to use local hsql db, turned on by default?, possibly output to text format on demand (Daniel, by mid April for Kepler release)
- Combine EML and DAP DataSource actors
- Create data location actor (finds data and spits out URL)
- Pass data by-reference in addition to by-value
- DataTurbine actor: PortParam bug, change name, gap filling feature (probably separate actor if doing interpolation), only output data on connected ports, assign icon
- CSV file reader (check name and documentation (tab, comma, csv should all return actor); ignore headers and footers (check the existing R Read.Table actor))
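The header/footer behavior the reader needs is roughly what R's read.table offers via its skip argument; a minimal sketch in Python (names are illustrative, not the actor's API):

```python
import csv

def read_delimited(text, skip_header_lines=0, skip_footer_lines=0, delimiter=","):
    """Parse delimited text, dropping a fixed number of
    non-data lines from the top and bottom of the file."""
    lines = text.splitlines()
    end = len(lines) - skip_footer_lines
    body = lines[skip_header_lines:end]
    return list(csv.reader(body, delimiter=delimiter))

sample = "station notes\ncol1,col2\n1,2\n3,4\nend of file"
rows = read_delimited(sample, skip_header_lines=1, skip_footer_lines=1)
# rows[0] is the column-name row: ['col1', 'col2']
```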
- Add DateTime token type
- Add DataFrame token type
- Modify Command Line Exec actor to trap errors so that workflow can continue after error or exception or core dump (Daniel)
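The trapping behavior might look like the following sketch (Python for illustration; the actor is implemented in Java). The idea is to convert every failure mode -- nonzero exit, signal/core dump, timeout, unlaunchable command -- into a status token instead of an exception, so downstream actors keep firing:

```python
import subprocess
import sys

def run_step(cmd, timeout=60):
    """Run an external command and never raise: report success/failure
    as data so the rest of the workflow can decide what to do."""
    try:
        proc = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
        # A core dump shows up as a negative returncode (killed by signal).
        return {"ok": proc.returncode == 0, "exit": proc.returncode,
                "stdout": proc.stdout, "stderr": proc.stderr}
    except subprocess.TimeoutExpired:
        return {"ok": False, "exit": None, "stdout": "", "stderr": "timeout"}
    except OSError as err:  # e.g., command not found
        return {"ok": False, "exit": None, "stdout": "", "stderr": str(err)}

result = run_step([sys.executable, "-c", "print('hello')"])
```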
- Remove External Execution actor from actor list, docs, and maybe code base, and/or maybe consolidate CommandLineExec with External Execution (Daniel)
- Develop common access interface for DataTurbine, SensorBase, and Antelope (Derik, Eric G/CENS, Matt, Ilkay)
- Initiate discussion between CENS/REAP/DataTurbine about commonalities in goals, approach, and interface (Eric G)