
Sohodojo and Communities of the Future proudly host...
The Center for Community Collaboration Technologies
Specification Writing for Web-based
Project Planning Software

M2 deliverable: Analysis of Comparable Project Planning/Management Offerings

Copyright (c) 2000 Jim Salmons and Frank Castellucci
All Rights Reserved

Associated project: Specification Writing for Web-based Project Planning Software

Project URL: http://sohodojo.com/techsig/project-planning-project.html

sXc Project detail: http://sourcexchange.com/ProjectDetail?projectID=24

Please note: SourceXchange has closed its doors. Any links to the SourceXchange web site are no longer available. We regret the inconvenience.

Project coordination: Sohodojo

Sponsors: Position open

Sponsors (M1-3): Opendesk.com and Collab.Net

Document status: Version 1.0 - Reviewed and approved by core team, submitted 08/16/2000 for extended team review and consideration as meeting the M2 deliverable.

Item: Milestone 2 Deliverable
Format: Multiple plain text files
File names: sxc24-m2-deliverable.txt - This document referencing...
Section Summary files
Raw Data Collection files


This document, being the Milestone 2 deliverable for CCCT Project #1, consists of this introductory file and a collection of associated files which capture and present the analysis data.

The Comparables Analysis Method

For the purposes of the comparable product/service analysis, our goal was to walk a fine line between a 'ruthless focus' on the scope of this spec -- which is 'general purpose project planning' -- and the needs and interests of the Open Source development community -- which is naturally skewed toward the more specific domain of 'software development project management'.

The Comparables Analysis was performed over the period of August 1-16, 2000. One reviewer from the core team, Frank or Jim, obtained access to the target application or service for hands-on evaluation. (The exceptions were MS Project 2000 and Project Central, which were assessed based on the reviewer's past experience and the latest information available from the Microsoft Project website.)

Given the tight work-effort budgeted for this analysis, it SHOULD NOT be taken as a rigorous 'product round-up' evaluation such as would be performed by ZD Labs or some other product testing laboratory. Our objective was to understand the range of features, the means of implementation and, most importantly, a sense of the underlying analytic model which gives each offering its 'essential character'.

For more information on the assessment goals, method and data collection topics, see the M1 deliverable.

Part 1 - The Core Team and Volunteer Analyses

Our recommended selection of products and services for the 'comparables analysis' is a mix of web-based services as well as network and single-user applications. The emphasis, however, is toward a selection of currently emerging web-based services rather than the application offerings.

This mix, with its emphasis on web-based services, was selected for a variety of reasons including the following:

  • Our project focus, as stated in the RFP and showcased in the project title, is clearly 'web-based project planning'.

  • Web technologies enable _qualitatively_ different collaboration models than have been explored in the more long-standing, single-user (project manager) application offerings.

  • Although the emerging web services are important to the analysis, the years of development and refinement in the project planning software applications marketplace make it an important source of insights into the underlying models of the problem space and into associated user interaction models.

Products and Services Analysed by the Core Team

The M1 deliverable identified a large pool of candidate products and services for analysis. The intent in identifying this broad candidate list was to evolve consensus among the project's extended team as to which products and services were of the highest interest to the team and sponsors.

The process of review and approval of the M1 deliverable did not produce any reduction in the analysis candidate list. The core team then roughly prioritized the candidate list to ensure that the assessments performed would cover the candidates with the highest potential to contribute insights to the team's subsequent spec writing efforts.

Here is the M1 candidate list of WEB-BASED SERVICES (in alpha order) considered during the analysis, together with what we accomplished:

  • eProject - http://www.eproject.com
    Status: Done by Jim.

  • PowerSteering (Cambridge Interactive) - http://www.psteering.com
    Status: Partially performed by Frank. Considerable effort was put into gaining access to this service and assessing its features. The 'trial account' project file was inadvertently deleted by the service provider and not restored in time for publication. Since Frank did not have sufficient access to complete the analysis, this candidate is not included in the final deliverable.

  • SourceForge (VA Linux) - http://sourceforge.net
    Status: Done by Frank.

  • WebProject (Novient) - http://www.wproj.com
    Status: Done by Frank.

  • X-Community - http://www.x-community.com
    Status: Done by Jim.

The following SOFTWARE APPLICATIONS (in alpha order) are to be included in the comparables analysis to be performed by the project's core team (Jim and Frank):

Status of Contributed Product and Service Analyses

No volunteer assessments or experience reports were contributed during the M2 work-effort.

The core team found the data collection outline helpful for gathering feature specs and associated reviewer impressions.

While the core team work-effort is a specific, time and effort-delimited project, we believe this project has the potential to begin a self-organizing and on-going community of interest. The vision for this on-going effort is described in our original project proposal and is embodied in the kick-off of The Center for Community Collaboration Technologies at Sohodojo.

It is our intent to post and publicize the detailed and summary results of this comparables analysis on the project website. By adding 'Add a comment', 'Ask a question' and 'Contribute an Experience Report' features to these on-line pages, we will attempt to grow the interest and participation in this project's Big Picture agenda.

As we work our way out of the traditional Northern Hemisphere Summer 'Dead Zone', we are starting to see increased interest in this project. The project's mailing list has grown to seventeen members without any publicity effort. With the 'critical mass' of the M2 deliverables on-line, we will increase our publicity efforts, which may result in increased and on-going collection of relevant analysis and experience report data in this domain.

Part 2 - Guide to the Comparables Analysis Documents

Two sets of files contain the assessment data developed during the comparables analysis.

The Raw Data Files

The first set of nine files contains the complete, raw data of the analysis, with one file for each product or service evaluated. The evaluated products and services and their associated complete data files are as follows (each offering name is a link to the HTML version of the deliverable's text file):

The Section Summaries

The data collection outline contained sixteen sections. Section 1 was an introductory statement and was not used for data collection and does not have a section summary file. The sixteenth section captures Reviewer Profile information. Since all evaluations were performed by the core team members, this section is not aggregated into a section summary file.

The purpose of the analysis was to refresh and extend the core team's knowledge and understanding of the problem domain of this project in support of the development of the software specification to be produced as subsequent milestones of this project.

As such, the analyses serve both a 'mindset-immersion' function for the core team and the development of a resource of data to be mined during the specification development phase. To assist in the mining of this data, we have created a second set of files which aggregate the analysis data across sections of the Comparables Analysis Data Collection outline.
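The aggregation step described above can be sketched in a few lines of Python. This is a hypothetical illustration only: the actual file names, and the 'Section N' heading convention assumed here, are not taken from the project's real data collection outline.

```python
# Hypothetical sketch of aggregating per-offering raw data files into
# per-section collections. The "Section N ..." heading convention and the
# shape of the input are assumptions, not the project's actual file format.
import re
from collections import defaultdict

def split_sections(text):
    """Split one raw data file into {section_number: body} on 'Section N' headings."""
    sections = {}
    headings = list(re.finditer(r"^Section (\d+)\b.*$", text, flags=re.MULTILINE))
    for i, match in enumerate(headings):
        end = headings[i + 1].start() if i + 1 < len(headings) else len(text)
        sections[int(match.group(1))] = text[match.end():end].strip()
    return sections

def aggregate(raw_files):
    """raw_files: {offering_name: file_text}.
    Returns {section_number: [(offering_name, section_body), ...]},
    i.e. the per-section view used to build the Section Summary files."""
    summaries = defaultdict(list)
    for offering, text in raw_files.items():
        for number, body in split_sections(text).items():
            summaries[number].append((offering, body))
    return summaries
```

Each resulting per-section list would then be written out as one Section Summary file, giving reviewers a cross-offering view of a single topic.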

These Summary Section files are as follows:

Comments or questions about this deliverable in general, or about the products or services assessed during this phase of the project, are welcome. Simply send your comments or questions here.

Section Summary files

Raw Data Collection files

### end of document sxc24-m2-deliverable.txt ###

© 1998-2010 Jim Salmons and Timlynn Babitsky for Sohodojo excepting project web pages and documents which are published under the appropriate Open Source documentation license.