Showing posts with label Career.

April 4, 2008

A Good Practice


The Association for Software Testing (AST) is a professional organization dedicated to advancing the understanding and practice of software testing. The AST provides forums for academics, students, and testing practitioners to discuss testing. AST does this through online forums, workshops, education programs, and conferences. The third annual Conference of the Association for Software Testing (CAST) provides a great forum for face-to-face conferring. This is not your typical conference where experts talk at the masses. This is the software testing conference that puts the confer back in conference.

Ever sit in a presentation about testing and think anything like the following?

  • Yeah that works for you but it'll never work in my situation.
  • What do you mean by X?
  • She must work with idiots.
  • How does he know what he says? I want to see data.
  • My management would never go for it.
  • What planet is he from?
  • You're full of it.

You are not only free to think these things at CAST, but you are free to question presenters. Time is built into the program for facilitated discussion of every presentation.

If you'd like to become a better software tester, join AST and come to CAST.

If you'd like to meet and confer with peers from around the world, join AST and come to CAST.

If you'd like to meet and confer with testing experts, join AST and come to CAST.

If you'd like to be challenged, join AST and come to CAST.

If you'd like to hear Gerald Weinberg talk about the past, present, and future of software testing, join AST and come to CAST.

If you can't afford those other testing conferences, join AST and come to CAST -- it's about half the price of other conferences.

If you'd like to compete against other testers, join AST and come to CAST.

If you care about software testing, join AST and come to CAST.


Only through judgment and skill,
exercised cooperatively throughout the entire project,
are we able to do the right things at the right times
to effectively test our products.
context-driven-testing.com


While I am not a believer in best practices, I believe it's a really good practice to participate in CAST 2008. See you in Toronto.

November 17, 2007

Finally, a tester certification test that I might like!


I am not a fan of any of the current software tester certification programs. Perhaps it is because I take the word certification too literally -- which means that I expect it to have real meaning. When I think of certification, I usually think along the lines of the IEEE's definition of certification.
certification

The process of confirming that a system or component complies with its specified requirements and is acceptable for operational use.
Perhaps my thinking is biased by my past IV&V testing work. If one can go to a weekend class and become certified, then I question the value of that certification. I believe that certifications based on ability to memorize terms and practices free of context are of little value -- and may do more harm than good. Yet, the purpose of this blog post is not to provide arguments against the current crop of tester certification options. If you'd like to see some concerns about certification, read the following.
If we must have a test certification based on a computer-scored multiple-false test, then I think I have found a test that is better than any others I've yet seen. This test covers many aptitudes and skills that I believe are important for testers:

  • Precision reading
  • Requirements interpretation
  • Persistence
  • Exploratory learning
  • Domain knowledge
  • Heuristic-based problem solving
  • A good sense of humor
  • Critical analysis
  • Looking at problems from many angles
  • Recognizing context
  • Understanding the software platform
  • Agility
  • Troubleshooting
  • Working under pressure
  • A good memory, or good note taking skills
  • Not being easily offended
Give it a try.



If I had to select testers based on passing a test, I think I'd take someone who has gotten further than I have in this quiz over someone with a software tester certification. I believe this quiz is a better measure of whether or not one is "acceptable for operational use". :)

How far can you get?

And did you notice the bug on question 30?

October 5, 2007

Are you smarter than a 3rd grader?

"I guess you could say I like to figure out how stuff works, I just like new adventures."
- Carson Page, 8 year old junior beta tester
[Photo: Carson Page, 8, junior beta tester. Credit: Rodolfo Gonzalez, American-Statesman]

Good testers can be hard to find. It looks like Actel Corp has found a good one. He is young. He is smart. He has excellent growth potential. And he works cheap -- for now.

Check out these stories:
I suspect that this kid does not know many testing buzzwords. I suspect he doesn't know much about testing tools and processes. However, Carson knows how to ask "why?" and communicate with engineers.
"We would ask what he liked and didn't like about it and he could explain it on a very high-end level."
- Mark Nagel, Actel Corp, Field Applications Engineer

A tester that can think, ask questions, and communicate can go far.

August 4, 2007

Extreme Telecommuting


"Ten years ago, there's no way this would have worked. Now there are hardly any barriers."

- Anthony Page

Many of us spend most of our days trapped in a cubicle or windowless office. At times I have enjoyed the opportunity to telecommute from home. I've had some good and bad home offices over the years. I've worked with great views and I've worked in basements. I'm a bit envious of James Bach's new digs.

I have the pleasure of working from home one day a week. I look forward to this day because I don't have to deal with traffic, I can work in the comfort of my own home, and I can get work done with fewer interruptions.

Earlier this week, I came across a CNN story about telecommuters who don't work from home. These telecommuters work from wherever they want to be. They are working globetrotters. Today's technology makes it possible for many people to work from anywhere in the world. I think we are still some time away from this being an option for many employees. However, it may be a viable option for contract work. If work can be outsourced to anywhere in the world, why not a beach or mountain top?

"People ask me where I live, and I'm not sure what to say, I'm not sure where I live. I live in the world."
- Trygve Inda

If you could be an extreme telecommuter, from where would you work?

July 12, 2007

Woodpeckers, Pinatas, and Dead Horses

Here are some short blurbs about a few things I took away from CAST sessions.

From Lee Copeland's keynote address:
  • "It's nonsensical to talk about automated tests as if they were automated human testing."
  • Write or speak about something you're knowledgeable and passionate about.
  • Combine things from multiple disciplines.

From Harry Robinson's keynote address:
  • Weinberg's Second Law: If Builders Built Buildings The Way Programmers Write Programs, Then The First Woodpecker That Came Along Would Destroy Civilization.

From Esther Derby's keynote:
  • To successfully coach someone, they must want to be coached and want to be coached by you.

From James Bach's tutorial:
  • Pinata Heuristic: Keep beating at it until the candy comes out. ... and stop once the candy drops.
unless ...
  • Dead Horse Heuristic: You may be beating a dead horse.
yet beware ...
  • If it is a pinata, don't stop beating at it until the candy drops; but if it is a dead horse, your beating is bringing no value. It can be a challenge to determine if it's a pinata or a dead horse.
From Antti Kervinen's presentation:
  • Separate automation models into high level (behavior) and low level (behavior implementation) components to reuse test models on a variety of platforms and configurations.
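Kervinen's high/low split can be sketched roughly as follows. This is only a toy illustration with invented names, not his actual framework: the high-level model describes the behavior once, and per-platform adapters supply the behavior implementation.

```python
# Hypothetical sketch: one high-level behavior model, two low-level
# platform adapters. The model says *what* to exercise; each adapter
# says *how* to do it on a particular platform or configuration.

class DesktopAdapter:
    """Low-level layer: behavior implementation for a desktop UI."""
    def login(self, user):
        return f"desktop:login({user})"
    def logout(self):
        return "desktop:logout()"

class MobileAdapter:
    """Same behaviors, implemented for a mobile UI."""
    def login(self, user):
        return f"mobile:tap_login({user})"
    def logout(self):
        return "mobile:tap_logout()"

def run_session_model(adapter, user):
    """High-level layer: a login/logout session, platform-agnostic."""
    return [adapter.login(user), adapter.logout()]

# The same behavior model reused across two configurations:
desktop_steps = run_session_model(DesktopAdapter(), "amy")
mobile_steps = run_session_model(MobileAdapter(), "amy")
```

Swapping the adapter reuses the one behavior model on another platform, which is the payoff Kervinen described.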

More from James Bach's tutorial:
  • Testing does not break software. Testing dispels illusions.
  • Rational Unified Process is none of the three. (attributed to Jerry Weinberg)

From the tester exhibition:

  • Testing what can't be fixed or controlled may be of little value. Some things may not be worth testing.
  • There is great value in the diversity of approaches and skills on a test team.
  • It may be possible to beat a dead horse and test (and analyze) too much. Sometimes we should just stop testing and act on the information we have.

From Doug Hoffman's tutorial:
  • Record and playback automation can be very useful for testing for the same behavior with many configurations. And, once the script stops finding errors: throw it out.

From Keith Stobie's keynote:
  • Reduce the paths through your system to improve quality. Fewer features may be better.
  • Free web sites often have higher quality than subscription sites. This is because it is easy to measure the cost of downtime on ad-supported systems.

From David Gilbert's session:
  • People expect hurricanes to blow around and change path. We should expect the same with software development projects. (David has some interesting ideas about forecasting in software development.)
  • Numbers tell a story only in context. You must understand the story behind the numbers.
One more from James:
  • Keep Notes!


What did you take away from CAST?

July 10, 2007

Read any good books lately?


"I've never read a book about software testing."

- too many testers

In a CAST keynote address about recent innovations in software testing, Lee Copeland relayed a story about asking all the testers at a large respected financial company about their favorite software testing books. Lee said that every one of the testers said they had never read a book about software testing.

Lee compared this to a surgeon informing a patient that they've never read a book about surgery, but not to worry because they are a good surgeon.

I too have asked a number of testers about their training to be a tester and have often received responses similar to those reported by Lee.

I want to pass on Lee's encouragement to read. Lee also heralded the benefits of applying lessons learned from outside technology fields (e.g., philosophy and psychology) to software testing.

There was a time that there weren't many testing books from which to choose. This has changed. Today, there are many. There are some good books out there, but there are also some terrible books that promote practices that have not adapted to the past 30 years of advances in software development.

The list in the sidebar of this blog contains some books I've found useful in software testing. Inclusion in this list does not imply my endorsement of everything in the book. I don't have to agree with a book to like it. To me, a good book is one that makes me think.

What good testing books have you read?

June 6, 2007

Stupid Questions

Why do we drive on parkways and park in driveways?

Why do noses run and feet smell?

If Jimmy cracks corn and no one cares, why is there a song about him?

Why do we call them restrooms when no one goes there to rest?

Why do you have to click the "Start" button to stop Windows?

Most of my life, I have been told that there are no such things as stupid questions. This was usually said to encourage me, and others, to not be afraid to learn. However, I am beginning to think that there is such a thing as a stupid question. I don't mean questions like the above. Coming up with the questions above requires some thought and I suspect they all have reasonable answers. The above questions are more silly than stupid.

So what do I consider to be a stupid question? A stupid question is a question that has little basis in intelligent thought. A stupid question is a question without the context required to provide an answer. A stupid question is one that the questioner would have realized has no answer had they thought about it.
(adj) stupid: lacking or marked by lack of intellectual acuity

(noun) question: a sentence of inquiry that asks for a reply
Before I continue, I admit that I have asked my share of stupid questions. I am, however, alarmed at the large number of stupid questions that software testers are asking in Internet discussion forums and newsgroups.

Here are some paraphrases of stupid questions I've recently seen posted online:
  • How can all tests be automated?
  • What are the limitations of [commercial functional test tool]?
  • What is functional testing? I don't want a definition, I want complete details.
  • What is the industry standard response time for web applications?
  • How much test case detail is required?
  • What is the best automation tool?
  • How do I test a [development platform] application?
  • What is the [one and only] definition for [fuzzy testing term]?
  • How do I do software testing?
  • What is the standard tester to developer ratio?
  • What's the best testing technique?
  • What are the CMM procedures for a test team of more than n people?
  • What is the role of the QA team?
  • How do I create test data?
  • How can I do exhaustive testing?
  • What is the best way to find bugs?
  • How many types of bug resolutions are there?
  • Who decides if a bug is resolved?
  • What's the difference between a requirement and a specification?
  • What is the formula for [magic metric that measures testing value without context]?
Most of these questions are unanswerable because they lack context or are made with the assumption that there is one right context-free answer. These questions may lead to interesting discussions but are not answerable with one-size-fits-all solutions.
Don't get stuck on stupid, reporters. We're moving forward.
... You are stuck on stupid. I'm not going to answer that question.

- Gen. Russel Honore
Many of the "senior" testers in online discussion forums answer stupid questions with the tact of General Honore. They are not trying to be rude. Most are not arrogant. They are experienced. Many have learned through their own failure that there are no magic solutions for general questions. Most of the experienced testers I've interacted with online are very willing to help. They are very willing to answer intelligent questions -- even if they disagree with a premise of the question.

Testing software is a context-sensitive intellectual task. An important aspect of testing is working through ambiguity to find and test what really matters. Testing is not a purely technical domain for which single best ways of doing things can be defined and applied regardless of context. Testers need to think and ask intelligent questions.

I asked plenty of questions when I was new to testing. I was given boundaries in which to work and was given freedom to think and learn within those boundaries. I had some great mentors that taught me a great deal about testing. The mentors provided me with good documentation, answered questions, and exemplified good testing practices. Some of the wisdom of my early mentors did not become clear to me until after I failed on my own. Experience is a great teacher. Sometimes we can learn from other people's successes and failures. Sometimes we have to learn on our own.

If you are new to testing, please ask questions. If you don't understand a term or technical detail, please ask. If a requirement is not clear, please ask. If you don't understand the context, please ask. If you need help, please ask. There are plenty of people able and willing to assist other testers. It would be foolish to pretend to know what you are doing when you do not. Asking for help or clarification is not a sign of weakness, it is a sign of intelligence.
Being ignorant is not so much a shame, as being unwilling to learn.
- Benjamin Franklin
Before asking a broad question, think about it. Ask yourself if it is answerable. Do a little research. Provide some context. Show that you care about the question and the requested answer. Realize that the specificity of your question is directly related to the specificity of the answer. General questions are unlikely to have a single answer. When you get an answer, test it. Try to think of situations in which the answer does not apply. Consider what new problems are created by any solution to an existing problem.
By three methods we may learn wisdom:
First, by reflection, which is noblest;
Second, by imitation, which is easiest;
and third by experience, which is the bitterest.
- Confucius

Now, why do we drive on parkways?

May 16, 2007

Faking It

Pradeep Soundararajan recently posted a podcast about fake experience on resumes. [listen] This reminded me of an experience I had with a fake resume.

A colleague came to me, dropped a resume in my hand, and asked if I had worked for a company listed on the resume. I quickly scanned the resume and noticed a former employer listed in the experience. I checked the dates and discovered that they included a period that I worked for that company. I then read details that listed projects in which I had been intimately involved. However, I did not recognize the name at the top of the resume. I then called several people at that company and could not find anyone that knew this person.

The experience listed on the resume was fake. It was a lie.

Lying on your resume can come back to haunt you -- sometimes even many years down the road. Don't fall into that trap.


This blatant lie was easily caught. Even if I had not worked for the company listed on the resume, whether or not someone worked for a company is usually easy to check. Former employers may be unlikely to give details about what a person did and why they left, but they will generally confirm whether or not someone was an employee.

Faking it may get your foot in the door, but once you are in you still have to perform. The person that submitted this fake resume was interviewed. It was reported to me that it quickly became clear that the person did not have the amount of experience they claimed.

Job hunting can be tough. Faking it does not help. It only makes it tougher. Tell the truth.

The truth may hurt for a little while but a lie hurts forever.