Saturday, November 8, 2008

Defect Tracking Tools

This is a list of defect tracking tools. Both commercial and freeware tools are included. The tools on this list are all available standalone, with the exception of a few that are integrated with a test management system. Tools that are only available as part of a bundled suite of tools, such as a configuration management toolset or a complex CASE tool, are not included. Tools that are better suited as call management tools than defect tracking tools are not included, though some tools that are listed claim to do both well.
Many of the tools on this page do not include a Software Description section; I omitted it to make the initial setup of the page easier. I am now accepting updates and new entries from vendors that include this section.


+1CR (+1 Software Engineering)
Aardvark (Red Gate Software Ltd.)
Abuky (freeware)
AceProject (Websystems Inc.)
AdminiTrack (AdminiTrack, Inc.)
Advanced Defect Tracking (Borderwave Software)
Alcea Fast BugTrack (Alcea Technologies Ltd.)
AllChange (Intasoft)
AQdevTeam (AutomatedQA Corp.)
Atlassian JIRA (Atlassian Software Systems)
BitDesk (PTLogica)
BMC Remedy Quality Management (BMC Software, Inc.)
BridgeTrak Suite (Kemma Software)
Bug Trail
Bug-Track.com
BugAware (Jackal Software Pty Ltd)
BugBase 2000 (Threerock Software)
BugBox (BugBox)
Bugcentral.com (Bugcentral Inc.)
Bug/Defect Tracking Expert (Applied Innovation Management, Inc.)
Buggit (freeware)
Buggy (Novosys EDV GmbH)
BugHost (Active-X.COM)
BugLister (Hajo Kirchhoff)
BugMonitor.com (BugMonitor.com, Inc.)
BugRat (freeware)
BugStation (Bugopolis LLC)
BUGtrack (ForeSoft Corporation)
Bugtrack (freeware)
Bug Tracker Server (Avensoft)
Bug Tracker Software (Bug Tracker Software)
BugUP
Bugzero (WEBsina)
Bugzilla (freeware)
Census Bug Tracking and Defect Tracking (Metaquest)
Change Commander (Lightspeed Software)
ClearDDTS (IBM Rational)
ClearQuest (IBM Rational)
CustomerFirst (Repository Technologies, Inc.)
Debian Bug Tracking System (freeware)
Defect Agent (Inborne Technology Corporation)
Defect Manager (Tiera Software, Inc)
Defect Tracker (New Fire)
defectX (defectX)
Deskzilla
DevTrack (TechExcel, Inc)
d-Tracker (Empirix)
elementool (elementool Inc.)
eQRP (Amadeus International Inc.)
ExtraView (Sesame Technology)
fixx
Flats Helpdesk (WarrinerWare)
FMAS (stag software private limited)
FogBUGZ (Fog Creek Software)
GNATS (freeware)
GRAN PM (GRAN Ltd.)
Helis (freeware)
icTracker (IC Soft, Inc.)
inControl (stag software private limited)
IOS/Track (Interobject Systems)
IssueNet Intercept
IssueView (IssueView.Com)
ITracker (Cowsultants.com)
JitterBug (freeware)
JTrac
LegendSoft SPoTS (LegendSoft Inc.)
Mantis (freeware)
McCabe CM - TRUEtrack (McCabe Software, Inc.)
OfficeClip Defect Tracker (OfficeClip. LLC)
OnTime Defect Tracker
Ozibug (Tortuga Technologies)
PloneCollectorNG (ZOPYX Software development and consulting Andreas Jung)
Problem Reporting System (Testmasters, Inc)
ProblemTracker (NetResults)
ProjectLocker (One Percent Software)
ProjectPortal (Most Media)
PR-Tracker (Softwise Company)
QAS.PTAR (Problem Tracking and Reporting)
QuickBugs (Excel Software)
RADAR (Cosmonet Solutions)
Razor/PT (Visible Systems Corporation)
RMTrack (RMTrack Issue Tracking Solutions Inc.)
Roundup (freeware)
Scarab (freeware)
SilkCentral Issue Manager (Borland)
SourceAction
SourceCast (CollabNet, Inc.)
Support Tracker
SWBTracker (Software with Brains Inc.)
Squish (Information Management Services, Inc.)
T-Plan Incident Manager (T-Plan)
TeamTrack (TeamShare, Inc.)
Telelogic Change (Telelogic AB)
TestTrack Pro (Seapine Software)
Trac (Edgewall Software)
Trackem (Pikon Innovations)
Tracker (freeware)
TrackStudio Enterprise (TrackStudio, Ltd)
TrackWeb Defects (Soffront)
Trackgear (LogiGear)
TrackRecord (Compuware)
Trackwise (Sparta Systems)
Visual Intercept (Elsinore Technologies)
vManage
WebPTS (Zambit Technologies, Inc.)
yKAP - Your Kind Attention Please (DCom Solutions)
ZeroDefect (ProStyle Software Inc.)

Friday, November 7, 2008

Software Test Planning



This shows the flow of deliverables among major participants (the stick figures in the use case diagram above).


The darker vertical lines illustrate the principal exchanges of artifacts of information — the teamwork necessary among participants.

Each artifact's color identifies one of the four metalayers (packages) of abstraction defined by the UML 2.0 Test Profile (U2TP) standard:

1. Test Architecture, defining concepts related to test structure and test configuration (the relationships of the elements involved in a test project)
2. Test Behavior, defining concepts related to the dynamic aspects of test procedures (the structural, static model plus the Test Execution Directives, i.e. the interface for testing)
3. Test Data, defining the structures and meaning of the values to be processed in a test
4. Test Time, defining concepts for a time-quantified definition of test procedures (constraints and time observation for test execution)

The word "Profile" in U2TP indicates that it is a standard UML profile, i.e. an extension that conforms to the UML standard.
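
To make these four packages more concrete, here is a minimal sketch that maps each one to a plain Python structure. It is purely illustrative: the class and field names are my own assumptions, not U2TP metamodel names.

from dataclasses import dataclass, field
from datetime import timedelta
from typing import Callable, List

# 1. Test Architecture: which elements take part in a test and how they are wired together.
@dataclass
class TestComponent:
    name: str                      # e.g. a driver, stub, or emulated peer

@dataclass
class TestConfiguration:
    sut_name: str                  # the system under test, imported from the design model
    components: List[TestComponent]

# 2. Test Behavior: the dynamic steps a test case performs against the SUT.
@dataclass
class TestCase:
    name: str
    behavior: Callable[[], str]    # returns a verdict string such as "pass" or "fail"

# 3. Test Data: the structures and meaning of values fed to and expected from the SUT.
@dataclass
class DataPool:
    partitions: dict = field(default_factory=dict)   # partition name -> sample values

# 4. Test Time: timers and constraints that bound test execution.
@dataclass
class TimeConstraint:
    max_duration: timedelta        # the test case must finish within this bound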

The seven basic types of testing as defined by SourceLabs' "CERT7" are:

1. Unit Testing
2. Functional Testing
3. Security Testing
4. Stress Testing
5. Scalability Testing
6. Reliability Testing
7. Integration Testing

Testing Terminologies

Mercury Interactive's Quality Center (formerly TestDirector) organizes each requirement for testing as a test subject for each application under test (AUT) within a Test Plan Tree hierarchy. Both manual and automated scripts can be specified in TestDirector. Each test script selected from the tree becomes a Test Step in a Test Set that TestDirector actually executes.

In Rational's TestManager, a test plan contains test cases organized within test case folders. These are read by Rational's ClearQuest defect tracking system.
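
As an illustration only (this is neither tool's actual API), the hierarchy both products impose can be pictured as a tree of subjects or folders containing test cases, from which an executable test set is selected:

from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class TestCaseRef:
    name: str                                        # a manual or automated script

@dataclass
class TestSubject:                                   # a folder/subject node in the tree
    name: str
    cases: List[TestCaseRef] = field(default_factory=list)
    children: List["TestSubject"] = field(default_factory=list)

def build_test_set(node: TestSubject, wanted: Set[str]) -> List[TestCaseRef]:
    """A test set is simply the subset of cases picked from the tree for one run."""
    selected = [c for c in node.cases if c.name in wanted]
    for child in node.children:
        selected.extend(build_test_set(child, wanted))
    return selected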

Why Bother with UML Test Profiles?

o You can describe a system precisely using UML; that is the reason UML was invented. The visual representation of test artifacts aims for a common and thus hopefully unambiguous interpretation of test designs.

o UML is the language (lingua franca) "spoken" by professional system architects and developers. Testers need to understand the Model Driven Architecture (MDA) models that those architects and developers design and build.

o The UML Test Profile standard includes a specification of a plain-text XML format which (at this point, theoretically) enables tool-independent interchange of test profile information.

o Soon, test tools will require testers to augment UML created by architects to specify testing at a higher level of abstraction instead of crafting scripts as automation testers do now. Testers will specify executable UML action semantics which are automatically compiled into platform-specific test components used to conduct testing.

Test Outputs
The purpose of testing is to obtain information needed to make decisions about a System Under Test (SUT).

Test Logs
With UML: A Test Log is an interaction resulting from the execution of a test case. It represents (remembers) the different messages exchanged between test components and the SUT and/or the states of the test components involved.

A log is associated with verdicts representing the adherence of the SUT to the test objective of the associated test case.
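
In plain data terms, such a log could be recorded as below. The record layout is a hypothetical sketch rather than the U2TP metamodel; the verdict values (pass, fail, inconclusive, error) follow the ones the profile predefines.

from dataclasses import dataclass
from enum import Enum
from typing import List

class Verdict(Enum):
    PASS = "pass"
    FAIL = "fail"
    INCONCLUSIVE = "inconclusive"
    ERROR = "error"

@dataclass
class LoggedMessage:
    sender: str                       # a test component or the SUT
    receiver: str
    payload: str

@dataclass
class TestLog:
    test_case: str
    messages: List[LoggedMessage]     # what was exchanged with the SUT
    verdict: Verdict                  # adherence to the test objective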

The names of test log files usually differ by vendor:
o Logs output by the testing tool:
+ WinRunner test logs
+ LoadRunner output.txt and run logs.
o Logs output by scripts within test tools.
o Java JVM verbose logs
o Windows OS Application logs and Security logs.
o An application's stdout and stderr files.
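
Whatever the vendor-specific names, a first pass over such logs is often just a scan for failure markers. A minimal sketch follows; the directory, file pattern, and markers are assumptions you would adapt to your own tools.

import re
from pathlib import Path

FAILURE_MARKERS = re.compile(r"\b(FAIL|ERROR|EXCEPTION)\b", re.IGNORECASE)

def scan_logs(log_dir: str, pattern: str = "*.log") -> dict:
    """Return {file name: [suspicious lines]} for lines that look like failures."""
    hits = {}
    for log_file in Path(log_dir).glob(pattern):
        lines = [ln.strip()
                 for ln in log_file.read_text(errors="ignore").splitlines()
                 if FAILURE_MARKERS.search(ln)]
        if lines:
            hits[log_file.name] = lines
    return hits

# Example: scan_logs("C:/test_runs/output", "*.txt")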

Test Results
Test results capture the key measures of a test:
o Number of
o Time needed to run script

Work Load Analysis Model
See Performance Testing using LoadRunner

Test Evaluation Summary
Typical outputs include Performance Reports, Enhancement Requests, and Defect Reports.

Test Plan Sections
Test Plan

Test Interface Specs
With U2TP, a test suite has a constraint: it must contain exactly one property realizing the Arbiter interface.


A test suite is a structured classifier acting as a grouping mechanism for a set of test cases. The composite structure of a test suite is referred to as the test configuration. The classifier behavior of a test suite is used for test control.
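
The arbiter's job is to combine the local verdicts reported by test components into one verdict for the test case; by default, a worse verdict overrides a better one. The sketch below illustrates that rule only and is not the U2TP Arbiter interface itself.

class Arbiter:
    """Keeps the 'worst' verdict reported so far as the final verdict."""
    _SEVERITY = {"pass": 0, "inconclusive": 1, "fail": 2, "error": 3}

    def __init__(self):
        self.verdict = "pass"

    def set_verdict(self, local_verdict: str) -> None:
        if self._SEVERITY[local_verdict] > self._SEVERITY[self.verdict]:
            self.verdict = local_verdict

arbiter = Arbiter()
arbiter.set_verdict("pass")
arbiter.set_verdict("fail")        # one test component reports a failure
print(arbiter.verdict)             # -> fail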

Test Environment Configuration

Test Automation Architecture

Test Classes

Test Scripts


Starting from the upper right corner of this diagram from the UML 2.0 Test standard document:
The System Under Test (SUT)
The system under test (SUT) is represented as a part and is the system, subsystem, or component being tested. An SUT can consist of several objects.

In the UML Testing Profile, the SUT is not specified as part of the test model itself; instead, the test architecture package imports the complete design (UML) model of the SUT in order to gain access to the elements to be tested.

The SUT is exercised via its public interface operations and signals by the test components.

It is assumed that no further information can be obtained from the SUT as it is a black-box.
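
In practice this means a test component drives the SUT only through its public operations and observes the responses. A small illustrative sketch with made-up names:

class AccountService:                      # stands in for the SUT; names are invented
    def __init__(self):
        self._balance = 0                  # internal state, invisible to the test
    def deposit(self, amount: int) -> int:
        if amount <= 0:
            raise ValueError("amount must be positive")
        self._balance += amount
        return self._balance

def test_deposit_via_public_interface():
    sut = AccountService()
    assert sut.deposit(100) == 100         # only public operations and return values are used
    try:
        sut.deposit(-5)
        verdict = "fail"                   # the SUT accepted invalid input
    except ValueError:
        verdict = "pass"
    assert verdict == "pass"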


Thursday, November 6, 2008

Software Testing Release Life Cycle

The Software Testing Life Cycle consists of seven (generic) phases: 1) Planning, 2) Analysis, 3) Design, 4) Construction, 5) Testing Cycles, 6) Final Testing and Implementation, and 7) Post Implementation.

1. Planning ( Product Definition Phase)

1.1. High Level Test Plan, (includes multiple test cycles)
1.2. Quality Assurance Plan (Quality goals, Beta criteria, etc ..)
1.3. Identify when reviews will be held.
1.4. Problem Reporting Procedures
1.5. Identify Problem Classification.
1.6. Identify Acceptance Criteria - for QA and Users.
1.7. Identify application testing databases
1.8. Identify measurement criteria, i.e. defect quantities/severity level and defect origin (to name a few).
1.9. Identify metrics for the project
1.10. Begin overall testing project schedule (time, resources etc.)
1.11. Requisite: Review Product Definition Document
1.11.1. QA input to document as part of the Process Improvement Project
1.11.2. Help determine scope issues based on Features of the Product
1.11.3. 5 - 10 hours / month approximately

1.12. Plan to manage all test cases in a database, both manual and automated.
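
One lightweight way to satisfy item 1.12 is a small relational store. The schema below is only a sketch of the kind of fields typically tracked, not a prescribed format.

import sqlite3

def create_test_case_db(path: str = "test_cases.db") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS test_case (
            id           INTEGER PRIMARY KEY,
            title        TEXT NOT NULL,
            requirement  TEXT,               -- traceability to the business requirement
            priority     INTEGER,            -- used later to pack test cycles
            est_minutes  INTEGER,            -- time estimate per execution
            automated    INTEGER DEFAULT 0,  -- 0 = manual, 1 = automated
            last_result  TEXT                -- pass / fail / not run
        )""")
    conn.commit()
    return conn

conn = create_test_case_db()
conn.execute("INSERT INTO test_case (title, requirement, priority, est_minutes, automated) "
             "VALUES (?, ?, ?, ?, ?)",
             ("Login with valid credentials", "BR-12", 1, 5, 1))
conn.commit()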

2. Analysis ( External Document Phase)

2.1. Develop Functional validation matrix based on Business Requirements.
2.2. Develop Test Case format - time estimates and priority assignments.
2.3. Develop Test Cycles matrices and time lines
2.4. Begin writing Test Cases based on Functional Validation matrix
2.5. Map baseline data to test cases to business requirements
2.6. Identify test cases to automate.
2.7. Automation team begins to set up variable files and high-level scripts in AutoTester.
2.8. Setup TRACK and AutoAdviser for tracking components of automated system.
2.9. Define area for Stress and Performance testing.
2.10. Begin development of Baseline Database as per test case data requirements.
2.11. Define procedures for Baseline Data maintenance, i.e. backup, restore, validate.
2.12. Begin planning the number of test cycles required for the project, and Regression Testing.
2.13. Begin review of documentation, i.e. Functional Design, Business Requirements, Product Specifications, Product Externals etc..
2.14. Review test environments and lab, both Front End and Back End.
2.15. Prepare for using McCabe tool to support development in white box testing and code complexity analysis.
2.16. Set up Requisite and start inputting documents.
2.17. Requisite: Review Externals Document
2.17.1. QA input to document as part of the Process Improvement Project
2.17.2. Start to write test cases from Action Response Pair Groups
2.17.3. Start to develop metrics based on the estimated number of test cases, the time to execute each case, and whether it is “automatable” (a rough effort-estimation sketch follows this list).
2.17.4. Define baseline data for each test case
2.17.5. 25 hours / month approximately
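
A rough effort-estimation sketch for item 2.17.3; all numbers are placeholders, not project data.

def estimate_cycle_effort(cases):
    """cases: list of (est_minutes, automatable) tuples; returns manual/automated totals."""
    manual = sum(minutes for minutes, automatable in cases if not automatable)
    automated = sum(minutes for minutes, automatable in cases if automatable)
    return {"manual_minutes": manual,
            "automated_minutes": automated,
            "total_hours": round((manual + automated) / 60, 1)}

# Example with placeholder data: three manual cases and two automated ones.
print(estimate_cycle_effort([(15, False), (30, False), (10, False), (5, True), (5, True)]))
# -> {'manual_minutes': 55, 'automated_minutes': 10, 'total_hours': 1.1}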

3. Design (Architecture Document Phase)

3.1. Revise Test Plan based on changes.
3.2. Revise Test Cycle matrices and timelines
3.3. Verify that Test Plan and cases are in a database or Requisite.
3.4. Revise Functional Matrix
3.5. Continue to write test cases and add new ones based on changes.
3.6. Develop Risk Assessment Criteria
3.7. Formalize details for automated testing and multi-user testing.
3.8. Select set of test cases to automate and begin scripting them.
3.9. Formalize detail for Stress and Performance testing
3.10. Finalize test cycles. (number of test cases per cycle based on time estimates per test case and priority.)
3.11. Finalize the Test Plan
3.12. (Estimate resources to support development in unit testing)
3.13. Requisite: Review Architecture Document
3.13.1. QA input to document as part of the Process Improvement Project
3.13.2. Actual components or modules that development will code.
3.13.3. Unit testing standard defined here, pass/fail criteria, etc.
3.13.4. Unit testing reports, what they will look like, for both white and black box testing including input/outputs and all decision points.
3.13.5. List of modules that will be unit tested.

4. Construction (Unit Testing Phase)

4.1. Complete all plans
4.2. Complete Test Cycle matrices and timelines
4.3. Complete all test cases. (manual)
4.4. Complete AutoTester scripting of first set of automated test cases.
4.5. Complete plans for Stress and Performance testing
4.6. Begin Stress and Performance testing
4.7. McCabe tool support - supply metrics
4.8. Test the automated testing system and fix bugs.
4.9. (Support development in unit testing)
4.10. Run QA Acceptance test suite to certify software is ready to turn over to QA.

5. Test Cycle(s) / Bug Fixes (Re-Testing/System Testing Phase)

5.1. Test Cycle I, run first set of test cases (front and back end)
5.2. Report bugs
5.3. Bug Verification - ongoing activity
5.4. Revise test cases as required
5.5. Add test cases as required
5.6. Test Cycle II
5.7. Test Cycle III

6. Final Testing and Implementation (Code Freeze Phase)

6.1. Execution of all front end test cases - manual and automated.
6.2. Execution of all back end test cases - manual and automated.
6.3. Execute all Stress and Performance tests.
6.4. Provide on-going defect tracking metrics.
6.5. Provide on-going complexity and design metrics.
6.6. Update estimates for test cases and test plans.
6.7. Document test cycles, regression testing, and update accordingly.

7. Post Implementation

7.1. Post implementation evaluation meeting to review entire project. (lessons learned)
7.2. Prepare final Defect Report and associated metrics.
7.3. Identify strategies to prevent similar problems in future projects.
7.4. Create a plan with goals and milestones for improving processes.
7.5. McCabe tools - produce final reports and analysis.
7.6. Automation team - 1) Review test cases to evaluate other cases to be automated for regression testing, 2) Clean up automated test cases and variables, and 3) Review process of integrating results from automated testing in with results from manual testing.
7.7. Test Lab and testing environment - clean up test environment, tag and archive tests and data for that release, restore test machines to baseline, etc.


Wednesday, November 5, 2008

QA Process and Offering

In today's business environment, compressed development schedules are commonplace in order to meet time-to-market deadlines. Implementing software risk management through all stages of the development cycle is essential for successful software product and application development. Our customized software risk management services cover project, product, and process related risks. We are a quality-driven enterprise and regularly work with a range of quality tools, from Mercury's suite to others.

Our software risk management framework includes:

• Development lifecycle study.
• Risk identification.
• Risk analysis.
• Risk mitigation strategy & Implementation.
• Risk monitoring.

Based on this model, we offer:

• Software Risk Management & Analysis.
• Full Lifecycle Testing.
• Black Box Testing.
• White Box Testing.
• Test Automation.
• Security Testing.

A Peek into our QA Process

Testing Life Cycle

Full Lifecycle Testing

Early detection of software defects can result in huge time and cost savings for any enterprise. Software testing at every stage of the software development life cycle can prevent defect occurrence and ensure reliable software development. We provide full life cycle testing services for product and application development life cycles. Our services include:

• Unit Testing.
• Integration Testing.
• System Testing.
• Regression Testing.
• Acceptance Testing.
• Test Management.

Black Box Testing

Verification and validation of software products and applications against functional and non-functional requirements forms the basis of the black box testing methodology. Through this methodology, software products and applications are tested for functionality, performance, platform and data compatibility, and ease of use. Our black box testing methodology covers the following:

• Functionality Testing.
• GUI Testing.
• Performance Testing.
• Stress and Load Testing.
• Compatibility Testing.
• Installation Testing.

White Box Testing

White box testing improves the testability of a software product by making testing more effective and efficient. Many modules and subsystems of a software project require testing in isolation. White box testing achieves this by exercising every line of code of the individual components of software products and applications. We provide white box testing services for applications developed in any language. Our services include:

• Code Coverage.
• Path Coverage.
• Code Analysis.
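
As a small illustration of the Code Coverage item above, a unit test plus a coverage run might look like the sketch below. The module and test names are assumptions; coverage.py is one commonly used tool among several.

# calculator.py (a hypothetical module under test)
def divide(a, b):
    if b == 0:
        raise ZeroDivisionError("b must not be zero")
    return a / b

# test_calculator.py
import pytest
from calculator import divide

def test_divide_normal():
    assert divide(10, 2) == 5

def test_divide_by_zero():
    with pytest.raises(ZeroDivisionError):
        divide(10, 0)

# Typical invocation from the shell, reporting line and branch coverage:
#   coverage run --branch -m pytest
#   coverage report -m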

Test Automation

Software testing is often perceived as a bottleneck in the software delivery process. Test automation addresses this by drastically reducing testing cycle times. It requires a thorough study of the development process and the software solution architecture, selection of suitable test automation tools, and implementation of the automation itself. Our test automation services encompass both white box and black box testing methodologies. Our test automation framework includes:

• Study of Development Process and Solution Architecture.
• Automation Test Strategy.
• Selection of Appropriate Testing Tools.
• Test Automation Implementation.
• Siri also develops customized test automation tools for specific client requirements.

Security Testing

In today's networked business environment, users access many enterprise applications over public networks. These applications contain confidential business data, which needs to be protected from unauthorized and unauthenticated access. Many enterprise applications and data stores are vulnerable to external and internal threats. We provide security testing services for software products, applications, and enterprise data. Our security testing services include:

• Application Vulnerability Assessment.
• Risk Identification.

Our application software testing and quality assurance services are designed for accelerated time-to-market and reduced costs.


Software Testing - Bug Life Cycle

Tuesday, November 4, 2008

Design

• There are two ways of constructing a software design. One way is to make it so simple that there are obviously no deficiencies. And the other way is to make it so complicated that there are no obvious deficiencies. (Charles Hoare)

• Imitating paper on a computer screen is like tearing the wings off a 747 and using it as a bus on the highway. (Ted Nelson)

• How would a car function if it were designed like a computer? Occasionally, executing a maneuver would cause your car to stop and fail and you would have to re-install the engine, and the airbag system would say, "Are you sure?" before going off. (Katie Hafner)

• A common mistake that people make when trying to design something completely foolproof was to underestimate the ingenuity of complete fools. (Douglas Adams)

• You can have any combination of features the Air Ministry desires, so long as you do not also require that the resulting airplane fly. (Willy Messerschmitt)

• More people have ascended bodily into heaven than have shipped great software on time. (Jim McCarthy)

• In the beginning we must simplify the subject, thus unavoidably falsifying it, and later we must sophisticate away the falsely simple beginning. (Maimonides)

• Walking on water and developing software from a specification are easy if both are frozen. (Edward V Berard)

• One of the great enemies of design is when systems or objects become more complex than a person - or even a team of people - can keep in their heads. This is why software is generally beneath contempt. (Bran Ferren)

• A complex system that works is invariably found to have evolved from a simple system that worked. (John Gall)

• Invariably, if something is so complex that it requires the addition of multiple preferences or customization choices, it is probably too complex to use. (Don Norman)

• There comes a time in the history of any project when it becomes necessary to shoot the engineers and begin production. (MacUser in 1990)

• The multiple stupidities of even the latest designs, such as Microsoft’s Windows 2000 or Apple’s OS X, show either an unjustifiable ignorance of or a near-criminal avoidance of what we do know [about existing engineering methods for designing human-computer interfaces]. (Jef Raskin)

• Designers talk and think a lot like science fiction writers do, except in a much less melodramatic and histrionic way. (Bruce Sterling)

• Recognizing the need is the primary condition for design. (Charles Eames)

• All really first class designers are both artists, engineers, and men of a powerful and intolerant temper, quick to resist the least modification of the plans, energetic in fighting the least infringement upon what they regard as their own sphere of action. (Nevil Shute)

• You can only put as much intelligence in a system as was in the design engineer to begin with. (Peter Orme)

• Design adds value faster than it adds cost. (Joel Spolsky)

• At a place like IBM, there's an infinite world of products that you can create. But, too often, management would say, "Great, you big-idea guys, go go go." But then they give all the money to the people who control the revenue streams, the people with the overhead projectors and PowerPoint slides. (Ted Selker)

• When one has no character one has to apply a method. (Albert Camus)

• The most powerful designs are always the result of a continuous process of simplification and refinement. (Kevin Mullet)

• If architects worked on the same principle [as software engineering], most buildings would end up looking like the Leaning Tower of Pisa. (David Crocker)

• There is no such thing as a boring project. There are only boring executions. (Irene Etzkorn)

• No, I'm not interested in developing a powerful brain. All I'm after is just a mediocre brain, something like the President of the American Telephone and Telegraph Company. (Alan Turing)

• A creative man is motivated by the desire to achieve, not by the desire to beat others. (Ayn Rand)

• Man has such a predilection for systems and abstract deductions that he is ready to distort the truth intentionally, he is ready to deny the evidence of his senses only to justify his logic. (Fyodor Dostoevsky)

• Building large applications is still really difficult. Making them serve an organisation well for many years is almost impossible. (Malcolm P Atkinson)

• We want to make a machine that will be proud of us. (Danny Hillis)

• If our designs are failing due to the constant rain of changing requirements, it is our designs that are at fault. We must somehow find a way to make our designs resilient to such changes and protect them from rotting. (Robert C Martin)

• If you cannot grok the overall structure of a program while taking a shower, you are not ready to code it. (Richard Pattis)

• Designers must do two seemingly contradictory things at the same time: They must design for perfection, and they must design as though errors are inevitable. And they must do the second without compromising the first. (Bob Colwell)

• The two main design principles of the NeXT machine appear to be revenge and spite. (Don Lancaster)

• When I am working on a problem, I never think about beauty. I think only of how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong. (R Buckminster Fuller)

• Tools that are meant to support serious, concentrated effort, where the task is well specified and the approach relatively well understood are best served by designs that emphasize function and minimize irrelevancies. (Don Norman)

• It's OK to figure out murder mysteries, but you shouldn't need to figure out code. You should be able to read it. (Steve McConnell)

• The edge of chaos is the constantly shifting battle zone between stagnation and anarchy, the one place where a complex system can be spontaneous, adaptive, and alive. (M Mitchell Waldrop)

• Technical skill is mastery of complexity, while creativity is mastery of simplicity. (E Christopher Zeeman)

• Very often, people confuse simple with simplistic. The nuance is lost on most. (Clement Mok)

• Those who admire the massive, rigid bone structures of dinosaurs should remember that jellyfish still enjoy their very secure ecological niche. (Beau Sheil)

• Design and programming are human activities; forget that and all is lost. (Bjarne Stroustrup)

• It occurred to me this morning that many system design flaws can be traced to unwarrantedly anthropomorphizing the user. (Steven Maker)

• Creativity involves breaking out of established patterns in order to look at things in a different way. (Edward de Bono)

• If programs had multiple ways to think, then they wouldn't so often get stuck, because they could change their points of view. (Marvin Minsky)

• The really good idea is always traceable back quite a long way, often to a not very good idea which sparked off another idea that was only slightly better, which somebody else misunderstood in such a way that they then said something which was really rather interesting. (John Cleese)

• It requires a very unusual mind to undertake the analysis of the obvious. (Alfred North Whitehead)

• Great design will not sell an inferior product, but it will enable a great product to achieve its maximum potential. (Thomas Watson Jr)

• Architect: Someone who knows the difference between that which could be done and that which should be done. (Larry McVoy)

• Anyone who conducts an argument by appealing to authority is not using his intelligence; he is just using his memory. (Leonardo da Vinci)

• Much of the Web is like an anthill built by ants on LSD. (Jakob Nielsen)

• The difference between a great design and a lousy one is in the meshing of the thousand details that either fit or don't, and the spirit of the passionate intellect that has tied them together, or tried. (Ted Nelson)

• There's a better way to do it. Find it. (Thomas Edison)

• How good the design is doesn't matter near as much as whether the design is getting better or worse. If it is getting better, day by day, I can live with it forever. If it is getting worse, I will die. (Kent Beck)

• In a room full of top software designers, if any two of them agree, that's a majority. (Bill Curtis)

• Form follows function - that has been misunderstood. Form and function should be one, joined in a spiritual union. (Frank Lloyd Wright)

• Out of intense complexities intense simplicities emerge. (Winston Churchill)

• The mathematical sciences particularly exhibit order, symmetry, and limitation; and these are the greatest forms of the beautiful. (Aristotle)

• An application that does something really great that people really want to do can be pathetically unusable, and it will still be a hit. And an application can be the easiest thing in the world to use, but if it doesn't do anything anybody wants, it will flop. (Joel Spolsky)

• Markets historically evolve past commoditization to value style and special features. (Nicholas Carr)

• Good engineering doesn't consist of random acts of heroism. (Harry Robinson)

• Quality isn't something you lay on top of subjects and objects like tinsel on a Christmas tree. (Robert Pirsig)

• Absolute certainty about the fail-proofness of a design can never be attained, for we can never be certain that we have been exhaustive in asking questions about its future. (Henry Petroski)

• No amount of genius can overcome a preoccupation with detail. (Marion Levy)

• A specification, design, procedure, or test plan that will not fit on one page of 8.5-by-11 inch paper cannot be understood. (Mark Ardis)

• Everyone designs who devises courses of action aimed at changing existing situations into preferred ones. (Herbert Simon)

• Crash programs fail because they are based on the theory that, with nine women pregnant, you can get a baby a month. (Wernher von Braun)

• You can't just ask customers what they want and then try to give that to them. By the time you get it built, they'll want something new. (Steve Jobs)

• One of the things that tools can do is to help bad designers create ghastly designs much more quickly than they ever could in the past. (Grady Booch)

• The understanding that underlies the right decision grows out of the clash and conflict of opinions and out of the serious consideration of competing alternatives. (Peter Drucker)

• Plan to throw one away. You will do that, anyway. Your only choice is whether to try to sell the throwaway to customers. (Frederick Brooks)

• If you plan to throw one away, you will throw away two. (Craig Zerouni)

• Each pattern describes a problem which occurs over and over again in our environment, and then describes the core of the solution to that problem, in such a way that you can use this solution a million times over, without ever doing it the same way twice. (Christopher Alexander)

• Things intended to be used under stressful situations require a lot more care, with much more attention to detail. (Don Norman)

• I'm not schooled in the science of human factors, but I suspect surprise is not an element of a robust user interface. (Chip Rosenthal)

• A well-designed and humane interface does not need to be split into beginner and expert subsystems. (Jef Raskin)

• Never, ever, ever let systems-level engineers do human interaction design unless they have displayed a proven secondary talent in that area. Their opinion of what represents good human-computer interaction tends to be a bit off-track. (Bruce Tognazzini)

• Graphic design will save the world right after rock and roll does. (David Carson)

Monday, November 3, 2008

Computer Laws

• Amdahl's Law: The speed-up achievable on a parallel computer can be significantly limited by the existence of a small fraction of inherently sequential code which cannot be parallelised. (Gene Amdahl)
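
Stated as a formula, with p the fraction of the work that can be parallelised and n the number of processors, the achievable speed-up is:

\[ S(n) = \frac{1}{(1 - p) + \frac{p}{n}}, \qquad \lim_{n \to \infty} S(n) = \frac{1}{1 - p} \]

so even a small sequential fraction (1 - p) caps the speed-up no matter how many processors are added.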

• Augustine's Second Law of Socioscience: For every scientific (or engineering) action, there is an equal and opposite social reaction. (Norman Augustine)

• Benford's Law: Passion is inversely proportional to the amount of real information available. (Gregory Benford)

• Brooks' Law: Adding manpower to a late software project makes it later. (Frederick P Brooks Jr)

• Church-Turing Thesis: Every function which would naturally be regarded as computable can be computed by the universal Turing machine.

• Clarke's First Law: When a distinguished but elderly scientist states that something is possible he is almost certainly right. When he states that something is impossible, he is very probably wrong. (Arthur C Clarke)

• Clarke's Second Law: The only way of discovering the limits of the possible is to venture a little way past them into the impossible. (Arthur C Clarke)

• Clarke's Third Law: Any sufficiently advanced technology is indistinguishable from magic. (Arthur C Clarke)

• Conway's Law: If you have four groups working on a compiler, you'll get a 4-pass compiler. (Melvin Conway)

• Cope's Law: There is a general tendency toward size increase in evolution. (Edward Drinker Cope)

• Dilbert Principle: The most ineffective workers are systematically moved to the place where they can do the least damage: management. (Scott Adams)

• Deutsch's Seven Fallacies of Distributed Computing: Reliable delivery; Zero latency; Infinite bandwidth; Secure transmissions; Stable topology; Single administrator; Zero cost. (Peter Deutsch)

• Ellison's Law: The userbase for strong cryptography declines by half with every additional keystroke or mouseclick required to make it work. (Carl Ellison)

• Ellison's Law: The two most common elements in the universe are hydrogen and stupidity. (Harlan Ellison)

• Ellison's Law: Once the business data have been centralized and integrated, the value of the database is greater than the sum of the preexisting parts. (Larry Ellison)

• Finagle's Law: Anything that can go wrong, will. (?Larry Niven)

• Fisher's Fundamental Theorem: The more highly adapted an organism becomes, the less adaptable it is to any new change. (R A Fisher)

• Fitts's Law: The movement time required for tapping operations is a linear function of the log of the ratio of the distance to the target divided by width of the target. (Paul Fitts)
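
In its common form, with a and b empirically fitted constants, D the distance to the target, and W the target width, Fitts's Law is written as:

\[ MT = a + b \log_2\!\left(\frac{2D}{W}\right) \]

(the Shannon variant, log2(D/W + 1), is also widely used).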

• Flon's axiom: There does not now, nor will there ever, exist a programming language in which it is the least bit hard to write bad programs. (Lawrence Flon)

• Gilder's Law: Bandwidth grows at least three times faster than computer power. (George Gilder)

• Godwin's Law: As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one. (Mike Godwin)

• Grosch's Law: The cost of computing systems increases as the square root of the computational power of the systems. (Herbert Grosch)

• Grove's Law: Telecommunications bandwidth doubles every century. (Andy Grove)

• Hanlon's Law: Never attribute to malice that which can be adequately explained by stupidity. (?Robert Heinlein)

• Hartree's Law: Whatever the state of a project, the time a project-leader will estimate for completion is constant. (Douglas Hartree)

• Heisenbug Uncertainty Principle: Most production software bugs are soft: they go away when you look at them. (Jim Gray)

• Hick's Law: The time to choose between a number of alternative targets is a function of the number of targets and is related logarithmically. (W E Hick)

• Hoare's Law: Inside every large problem is a small problem struggling to get out. (Charles Hoare)

• Hofstadter's Law: It always takes longer than you think, even when you take Hofstadter's Law into account. (Douglas Hofstadter)

• Jakob's Law of the Internet User Experience: Users spend most of their time on other websites. (Jakob Nielsen)

• Joy's Law: Computing power of the fastest microprocessors, measured in MIPS, increases exponentially in time. (Bill Joy)

• Kerckhoffs's Principle: Security resides solely in the key. (Auguste Kerckhoffs)

• Kurzweil's Law of Accelerating Returns: As order exponentially increases, time exponentially speeds up (that is, the time interval between salient events grows shorter as time passes). (Ray Kurzweil)

• Law of the Conservation of Catastrophe: The solutions to one crisis pave the way for some equal or greater future disaster. (William McNeill)

• Law of False Alerts: As the rate of erroneous alerts increases, operator reliance, or belief, in subsequent warnings decreases. (George Spafford)

• Lister's Law: People under time pressure don't think faster. (Timothy Lister)

• Lloyd's Hypothesis: Everything that's worth understanding about a complex system, can be understood in terms of how it processes information. (Seth Lloyd)

• Metcalfe's Law: The value of a network grows as the square of the number of its users. (Robert Metcalfe)

• Moore's Law: Transistor die sizes are cut in half every 24 months. Therefore, both the number of transistors on a chip and the speed of each transistor double every 18 (or 12 or 24) months. (Gordon Moore)

• Murphy's Law: If there are two or more ways to do something, and one of those ways can result in a catastrophe, then someone will do it. (Edward A Murphy)

• Nathan's First Law: Software is a gas; it expands to fill its container. (Nathan Myhrvold)

• Ninety-ninety Law: The first 90% of the code accounts for the first 90% of the development time. The remaining 10% of the code accounts for the other 90% of the development time. (Tom Cargill)

• Occam's Razor: The explanation requiring the fewest assumptions is most likely to be correct. (William of Occam)

• Osborn's Law: Variables won't; constants aren't. (Don Osborn)

• Parkinson's Law: Work expands so as to fill the time available for its completion. (C Northcote Parkinson)

• Pareto Principle: 20% of the people own 80% of the country's assets. (Corollary: 20% of the effort generates 80% of the results.) (Vilfredo Pareto)

• Pesticide Paradox: Every method you use to prevent or find bugs leaves a residue of subtler bugs against which those methods are ineffectual. (Boris Beizer)

• Peter Principle: In a hierarchy, every employee tends to rise to his level of incompetence. (Laurence J Peter)

• Red Queen Principle: For an evolutionary system, continuing development is needed just in order to maintain its fitness relative to the system it is co-evolving with. (Leigh van Valen)

• Rock's Law: The cost of semiconductor fabrication equipment doubles every four years. (Arthur Rock)

• Rule of 1950: The probability that automated decisions systems will be adopted is approximately one divided by one plus the number of individuals involved in the approval process who were born in 1950 or before squared. (Frank Demmler)

• Sixty-sixty Law: Sixty percent of software’s dollar is spent on maintenance, and sixty percent of that maintenance is enhancement. (Robert Glass)

• Spector's Law: The time it takes your favorite application to complete a given task doubles with each new revision. (Lincoln Spector)

• Sturgeon's Law: Ninety percent of everything is crap. (Theodore Sturgeon)

• Tesler's Law of Conservation of Complexity: You cannot reduce the complexity of a given task beyond a certain point. Once you've reached that point, you can only shift the burden around. (Larry Tesler)

• Tesler's Theorem: Artificial Intelligence is whatever hasn't been done yet. (Larry Tesler)

• Weibull's Power Law: The logarithm of failure rates increases linearly with the logarithm of age. (Waloddi Weibull)

• Weinberg's Law: If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization. (Gerald M Weinberg)

• Wirth's Law: Software gets slower faster than hardware gets faster. (Niklaus Wirth)

• Zawinski's Law: Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can. (Jamie Zawinski)

Sunday, November 2, 2008

21st Century

2005 We are approaching the dark ages point, when the rate of innovation is the same as it was during the Dark Ages. We'll reach that in 2024. (Jonathan Huebner)

2005 Within five years, every banking customer will have a banking terminal in their pocket, drawing on the mainframe. (Arif Mohamed)

2004 By the 2030s, the nonbiological portion of our intelligence will predominate. (Ray Kurzweil)

2004 The mainframe is not going to die, and will be around for basically forever. (John Swainson)

2004 By 2010, the average network connection speed to the home will be 10 times faster than today's [50Mbps] ADSL in Japan. (Kunio Nakamura)

2004 In five years, much of the business handled today by paper forms scanning and data capture will have moved to XML data transmitted over the Web. (Bruce Silver)

2004 Over the next couple of decades IT professionals will help workers integrate computing and communications onto and into their bodies and brains, with wearables and implants. (James Hughes)

2004 We're going to have computers, not too long from now, that don't have screens and where the information is presented as a hologram in the air above a keyboard. (Grant Evans)

2004 Ten years out you can almost think of hardware as being free. (Bill Gates)

2004 The future for IT is the same as it was for agriculture and manufacturing. (Joe Celko)

2004 Off-shoring is just another management fad and we're going to see it blow over. (Eric Raymond)

2004 A concept called ambient intelligence, where technology is embedded in our natural surroundings, ever present and available for access by the individual, will be accomplished within the next 25 years. (Laura Peters)

2003 Unix is dead. [see 1986] (Gus Robertson of Redhat)

2003 By 2014 the web may reach the level of user empowerment defined by the Macintosh in 1984. (Jakob Nielsen)

2003 By 2007, software systems will be developed and maintained through collaborative development environments, consisting of thousands of moving parts that are never turned off. (Grady Booch)

2003 IT doesn't matter. (Nicholas Carr)

2002 There will be a major cyber-terrorism event in 2003. It will be enough to disrupt the economy for a while, bring the Internet to its knees for a day or two. (John Gantz)

2002 I do think [in 20 years the global database] will exist, and I think it is going to be an Oracle database. And we’re going to track everything. (Larry Ellison)

2002 Linux will become the dominant server operating system in the United States by 2005. (Stacey Quandt)

2001 By 2009, computers will disappear. Displays will be written directly onto our retinas by devices in our eyeglasses and contact lenses. (Ray Kurzweil)

2001 We will need one million new people to be running these new e-businesses in the US in the next year alone. (John Gantz)

2001 We've had three major generations of computing: mainframes, client/server and Internet computing. There will be no new architecture for computing for the next 1,000 years. (Larry Ellison)

2000 Supercomputers will achieve one human brain capacity by 2010, and personal computers will do so by about 2020. (Ray Kurzweil)