Judging at the Fair 2020

Judging at Your Fair

The orientation meetings will be held in the BlueCross BlueShield of South Carolina Hall (Room 123) at the Darla Moore School of Business, next door to the USC Coliseum. The meeting for the Junior Division will begin at 9:00 AM, and the meeting for the Senior Division will begin at 1:30 PM.




Every Intel ISEF-affiliated fair has its own methodology for judging projects. We provide the following tips and judging criteria as suggested aids in your process. The points below may be of value to you and your judges as they review and score projects.


  • Examine the quality of the Finalist’s work, and how well the Finalist understands his or her project and area of study. The physical display is secondary to the student’s knowledge of the subject. Look for evidence of laboratory, field, or theoretical work, not just library research or gadgeteering.
  • Judges should keep in mind that a science fair is not only a competition but also an educational and motivating experience for the students. For most students, the high point of the Fair experience is the judging interviews.
  • Students may have worked on a research project for more than one year. However, for the purpose of judging, ONLY research conducted within the current year is to be evaluated. Although previous work is important, it should not unduly impact the judging of this year’s project. 
  • As a general rule, judges represent professional authority to Finalists. For this reason, judges should use an encouraging tone when asking questions, offering suggestions or giving constructive criticism. Judges should not criticize, treat lightly, or display boredom toward projects they personally consider unimportant.  Always give credit to the Finalist for completing a challenging task and/or for their success in previous competitions.
  • Compare projects only with those competing at this Fair and not with projects seen in other competitions or scholastic events.
  • It is important in the evaluation of a project to determine how much guidance was provided to the student in the design and implementation of his or her research. When research is conducted in an industrial or institutional setting, the student should have documentation, most often the Intel ISEF Form 1C, that provides a forum for the mentor or supervisor to discuss the project. Judges should review this information in detail when evaluating research.
  • Please be discreet when discussing winners or making critical comments in elevators, restaurants, or elsewhere, as students or adult escorts might overhear.  Results are confidential until announced at the awards ceremony.


  • Provide the students with a brief explanation of the judging process. Include information such as the rules for student conduct and attendance, the estimated number of judging interviews to expect, and any information possible about the levels or tiers of judging taking place.
  • Provide an explanation to judges and students about the different types of judging and any rules for each type of judge. Many fairs, including the Intel ISEF, have both category (or grand award) judging and special award judging. Category judging is the primary process that determines the Fair’s place winners; special award judging is most often conducted by the professional scientific organizations, colleges and universities, or governmental agencies that sponsor their awards. Understanding who is on the floor helps everyone work together.
  • Take all steps possible to provide a just and equitable judging process without bias. Develop a judges’ code of conduct and a clearly defined set of criteria that your Fair judges must follow. Have procedures in place to eliminate any potential conflict of interest and always have a sufficient number of Fair representatives available during judging to handle any problems that may arise.


Evaluation Criteria for Category Judging

The criteria and questions below are used by the Grand Awards Judges of the Intel ISEF and are suggested as a guide for your category judging. Scientific Thought and Engineering Goals are separated into IIa. and IIb. to be used as appropriate for each category. There are also added questions for team projects.

I.  Creative Ability (30 points)

  1. Does the project show creative ability and originality in the questions asked?
  2. Creative research should support an investigation and help answer a question in an original way.
  3. A creative contribution promotes an efficient and reliable method for solving a problem. When evaluating projects, it is important to distinguish between gadgeteering and ingenuity.

II a.  Scientific Thought  (30 points)

For an engineering project, or some projects in categories such as computer science and mathematical sciences, the more appropriate questions are those found in IIb.  Engineering Goals.

  1. Is the problem stated clearly and unambiguously?
  2. Was the problem sufficiently limited to allow a plausible approach? Good scientists can identify important problems capable of solutions.
  3. Was there a procedural plan for obtaining a solution?
  4. Are the variables clearly recognized and defined?
  5. If controls were necessary, did the student recognize their need and were they correctly used?
  6. Are there adequate data to support the conclusions?
  7. Does the Finalist or team recognize the data’s limitations?
  8. Does the Finalist/team understand the project’s ties to related research?
  9. Does the Finalist/team have an idea of what further research is warranted?
  10. Did the Finalist/team cite scientific literature, or only popular literature (e.g., local newspapers, Reader’s Digest)?

II b. Engineering Goals (30 points)

  1. Does the project have a clear objective?
  2. Is the objective relevant to the potential user’s needs?
  3. Is the solution workable? Is it acceptable to the potential user? Is it economically feasible?
  4. Could the solution be utilized successfully in the design or construction of an end product?
  5. Is the solution a significant improvement over previous alternatives?
  6. Has the solution been tested for performance under the conditions of use?

III.  Thoroughness (15 points)

  1. Was the purpose carried out to completion within the scope of the original intent?
  2. How completely was the problem covered?
  3. Are the conclusions based on a single experiment or on replications?
  4. How complete are the project notes?
  5. Is the Finalist/team aware of other approaches or theories?
  6. How much time did the Finalist or team spend on the project?
  7. Is the Finalist/team familiar with scientific literature in the studied field?

IV.  Skill (15 points)

  1. Does the Finalist/team have the required laboratory, computation, observational, and design skills to obtain supporting data?
  2. Where was the project performed (e.g., home, school laboratory, university laboratory)? Did the student or team receive assistance from parents, teachers, scientists, or engineers?
  3. Was the project completed under adult supervision, or did the student/team work largely alone?
  4. Where did the equipment come from?  Was it built independently by the Finalist or team?  Was it obtained on loan?  Was it part of a laboratory where the Finalist or team worked?

V.  Clarity (10 points)

  1. How clearly does the Finalist discuss his/her project and explain the purpose, procedure, and conclusions?  Watch out for memorized speeches that reflect little understanding of principles.
  2. Does the written material reflect the Finalist’s or team’s understanding of the research?
  3. Are the important phases of the project presented in an orderly manner?
  4. How clearly are the data presented?
  5. How clearly are the results presented?
  6. How well does the project display explain the project?
  7. Was the presentation done in a forthright manner, without tricks or gadgets?
  8. Did the Finalist/team perform all the project work, or did someone help?

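For fairs that tally scores numerically, the five criteria above sum to a maximum of 100 points (30 + 30 + 15 + 15 + 10). The sketch below, which is not an official ISEF tool, shows how a fair might total a judge’s scores under that rubric; only the criterion names and maximum point values come from this guide, and everything else is hypothetical.

```python
# Maximum points per criterion, taken from the rubric above.
# (Criterion IIa or IIb is scored, as appropriate for the category.)
MAX_POINTS = {
    "Creative Ability": 30,
    "Scientific Thought / Engineering Goals": 30,
    "Thoroughness": 15,
    "Skill": 15,
    "Clarity": 10,
}

def total_score(scores):
    """Sum a judge's scores, validating each against its criterion maximum."""
    total = 0
    for criterion, maximum in MAX_POINTS.items():
        awarded = scores.get(criterion, 0)
        if not 0 <= awarded <= maximum:
            raise ValueError(f"{criterion}: {awarded} outside 0-{maximum}")
        total += awarded
    return total

# Hypothetical example: a project scored near the top of each criterion.
example = {
    "Creative Ability": 27,
    "Scientific Thought / Engineering Goals": 25,
    "Thoroughness": 13,
    "Skill": 12,
    "Clarity": 9,
}
print(total_score(example))  # 86 out of a possible 100
```

Totals like this are typically used only to rank projects within a single fair, consistent with the guidance above to compare projects only against those competing at this Fair.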
For further guidelines, please see the ISEF Judging Guide.