Automation QA Testing Course Content

ExtentReports in Selenium


Why do we need reporting?

Reports communicate status to:
                1) Upper Management
                2) Development Team
                3) Product/Project Management

Why reporting matters:

1) It represents the automation effort to stakeholders outside of the team.

2) Reports capture the time taken by each test case, which helps calculate ROI (Return on Investment).

3) Reports can be shared with your team and with clients to communicate the status of testing progress.
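As a rough illustration of the ROI point above, per-test durations taken from a report can be compared against an assumed manual execution time. The numbers and names below are made up for the sketch:

```java
import java.util.Map;

class RoiSketch {
    // Time saved per run: assumed manual minutes minus measured automated minutes
    static double minutesSaved(Map<String, Double> automatedMinutes, double manualMinutesPerTest) {
        double automated = automatedMinutes.values().stream()
                .mapToDouble(Double::doubleValue).sum();
        double manual = manualMinutesPerTest * automatedMinutes.size();
        return manual - automated;
    }

    public static void main(String[] args) {
        // Hypothetical per-test durations (in minutes) read from a report
        Map<String, Double> durations = Map.of(
                "loginTest", 0.5,
                "searchTest", 0.8,
                "checkoutTest", 1.2);
        System.out.printf("minutes saved per run: %.1f%n", minutesSaved(durations, 10.0));
    }
}
```

Time saved per run is only a simple proxy for ROI, but it is the kind of figure that report durations make easy to produce.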

What goes in Reports?
        1) High-level status of the automation run
        2) Clear representation of the Test Suite/Test Class
        3) Names of Test Classes and Test Methods
        4) Status of Test Methods (Success/Failed/Skipped)
        5) No exception messages or other debugging details
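The high-level status described above can be modeled as a small summary type. This is purely illustrative; the names are not part of ChainTest or ExtentReports:

```java
// Illustrative summary of a single automation run
record RunSummary(String suiteName, int passed, int failed, int skipped) {
    int total() {
        return passed + failed + skipped;
    }

    // One-line status suitable for the top of a report or an email subject
    String oneLine() {
        return String.format("%s: %d/%d passed (%d failed, %d skipped)",
                suiteName, passed, total(), failed, skipped);
    }
}
```

For example, `new RunSummary("SmokeSuite", 8, 1, 1).oneLine()` produces `SmokeSuite: 8/10 passed (1 failed, 1 skipped)` -- exactly the kind of high-level line a report should lead with.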

ChainTest:

A comprehensive reporting framework supporting multiple generators (static, email, and real-time), with historical analytics via ChainLP.

Generators:

What is a ChainTest generator?

A generator is responsible for creating output files based on the test results and configuration settings. It processes templates, saves necessary resources, and generates reports in a specified format.

How to enable generators?

ChainTest is still in active development, and only Java unit test frameworks are supported at present. Generators can be enabled via properties files located on the classpath; for more information, see the supported plugins below.

Supported Test Frameworks

ChainLP

ChainLP (Chain-Long-Playing, like an LP record) is a Java (Spring) server that packs the Angular frontend and is distributed as a Docker image. ChainLP is the framework component that provides historical analytics.

The Docker image is available from https://hub.docker.com/r/anshooarora/chaintest.

The recommended way to run ChainLP is with docker-compose. The currently supported databases are listed below, but most RDBMS databases should work.

  • H2
  • MySQL
  • PostgreSQL
For each database, there is a separate docker-compose.yml available at chainlp/docker. H2 provides the most straight-forward way to test, but it is NOT recommended for production use.

# clone the repository
git clone https://github.com/anshooarora/chaintest.git
cd chaintest/chainlp/docker

# h2
docker compose -f docker-compose-h2.yml up

# mysql
docker compose -f docker-compose-mysql.yml up

# postgres
docker compose -f docker-compose-postgres.yml up

ChainLP config


Add the host:port of the ChainLP server to chaintest.properties, along with the required value for chaintest.project.name.

The host:port combination is where the client (via the plugin) connects and communicates over TCP. For more information on how communication is established, look into ChainTestApiClient.


# (only the relevant bits shown below)

# chaintest configuration
chaintest.project.name=default

# generators:
## chainlp
chaintest.generator.chainlp.enabled=true
chaintest.generator.chainlp.host.url=<host:port>

ChainTest Configuration

Java (chaintest.properties):

# chaintest configuration
chaintest.project.name=default

# storage
chaintest.storage.service.enabled=false
# [azure-blob, aws-s3]
chaintest.storage.service=azure-blob
# s3 bucket or azure container name
chaintest.storage.service.container-name=

# generators:
## chainlp
chaintest.generator.chainlp.enabled=true
chaintest.generator.chainlp.class-name=com.aventstack.chaintest.generator.ChainLPGenerator
chaintest.generator.chainlp.host.url=http://localhost/
chaintest.generator.chainlp.client.request-timeout-s=30
chaintest.generator.chainlp.client.expect-continue=false
chaintest.generator.chainlp.client.max-retries=3

## simple
chaintest.generator.simple.enabled=true
chaintest.generator.simple.document-title=chaintest
chaintest.generator.simple.class-name=com.aventstack.chaintest.generator.ChainTestSimpleGenerator
chaintest.generator.simple.output-file=target/chaintest/Index.html
chaintest.generator.simple.offline=false
chaintest.generator.simple.dark-theme=true
chaintest.generator.simple.datetime-format=yyyy-MM-dd hh:mm:ss a
chaintest.generator.simple.js=
chaintest.generator.simple.css=

## email
chaintest.generator.email.enabled=true
chaintest.generator.email.class-name=com.aventstack.chaintest.generator.ChainTestEmailGenerator
chaintest.generator.email.output-file=target/chaintest/Email.html
chaintest.generator.email.datetime-format=yyyy-MM-dd hh:mm:ss a

Screenshots/Embeds


The chaintest-core client is the framework component that enables plugins to store embeds for each report. For example, with SimpleGenerator, the client saves all embeds relative to the report file, in the resources folder.

For embeds to work with ChainLP, the client requires the following to be enabled:


# storage
chaintest.storage.service.enabled=true
# [azure-blob, aws-s3]
chaintest.storage.service=azure-blob
# s3 bucket or azure container name
chaintest.storage.service.container-name=

There is still some work to be done within this area but a few quick pointers if you're ready to explore further:

  • Storage has been tested to work with both azure-blob and aws-s3 when the bucket or container has public access enabled
  • For buckets/containers requiring auth, AWS S3 support is complete, including pre-signed URLs when served to the frontend
  • Azure Blob is not fully supported (yet)
  • The client and ChainLP both use the AWS Credential Chain to authenticate against the bucket and store/access blob data
    • For ChainLP, the secrets can be configured via <host>/settings by clicking the Secrets tab

Is Docker required for all ChainTest reports?

  • Docker is not a requirement for any of the static reports (SimpleGenerator, EmailGenerator)
  • Docker is required to host ChainLP as it is only available as a Docker image

When is Docker required for ChainTest?

In ChainTest's context, Docker is required only when setting up ChainLP, for one or more of the reasons below:

  • Comprehensive Dashboard: Ideal for generating historical analytics and consolidating multiple project reports in one place.
  • Quick Setup: If you want to avoid manually setting up dependencies, Docker provides a pre-configured environment.
  • Consistency: Ensures that the application runs the same way across different systems without dependency conflicts.
  • Testing Complete Functionality: If you want to test the entire ChainTest environment with all its features (not just static report generation), Docker simplifies the setup.
chainlp/docker/docker-compose-h2.yml:
version: '3'
services:
  chaintest:
    image: anshooarora/chaintest:latest
    container_name: chaintest
    environment:
      - "SPRING_PROFILES_ACTIVE=h2"
      - SPRING_DATASOURCE_URL=jdbc:h2:file:./data/db
      - SPRING_DATASOURCE_DRIVERCLASSNAME=org.h2.Driver
    ports:
      - 80:80
-----------------------------------------------------
chainlp/docker/docker-compose-mysql.yml:
version: '3'
services:
  chaintest:
    image: anshooarora/chaintest:latest
    container_name: chaintest
    environment:
      - "SPRING_PROFILES_ACTIVE=mysql"
      - SPRING_DATASOURCE_URL=jdbc:mysql://host.docker.internal:3306/chaintest?autoReconnect=true
      - SPRING_DATASOURCE_DRIVERCLASSNAME=com.mysql.cj.jdbc.Driver
      - SPRING_DATASOURCE_USERNAME=
      - SPRING_DATASOURCE_PASSWORD=
    ports:
      - 80:80
---------------------------------------------------------------------------
chainlp/docker/docker-compose-postgres.yml:
version: '3'
services:
  chaintest:
    image: anshooarora/chaintest:latest
    container_name: chaintest
    environment:
      - "SPRING_PROFILES_ACTIVE=postgres"
      - SPRING_DATASOURCE_URL=jdbc:postgresql://host.docker.internal:5432/chaintest
      - SPRING_DATASOURCE_DRIVERCLASSNAME=org.postgresql.Driver
      - SPRING_DATASOURCE_USERNAME=
      - SPRING_DATASOURCE_PASSWORD=
    ports:
      - 80:80
-----------------------------------------------------------------
Add the dependencies below to pom.xml:
<!-- https://mvnrepository.com/artifact/com.aventstack/chaintest-testng -->
<dependency>
    <groupId>com.aventstack</groupId>
    <artifactId>chaintest-testng</artifactId>
    <version>1.0.5</version>
</dependency>
-------------------------------------------------------
<!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-api -->
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>2.0.16</version>
</dependency>
----------------------------------------------------------------------------
<!-- https://mvnrepository.com/artifact/ch.qos.logback/logback-classic -->
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.5.16</version>
    <scope>test</scope>
</dependency>
-------------------------------------------------------------------------
// register the ChainTest listener on your base test class
@Listeners(ChainTestListener.class)
public class BaseTest {
    // ...
}
-----------------------------------------------------------------
<listeners>
      <listener class-name="com.aventstack.chaintest.plugins.ChainTestListener"></listener>
</listeners>
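For context, a complete testng.xml wiring the listener might look like the sketch below; the suite, test, and class names are placeholders, not from this course:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="ChainTestSuite">
  <listeners>
    <listener class-name="com.aventstack.chaintest.plugins.ChainTestListener"/>
  </listeners>
  <test name="SmokeTests">
    <classes>
      <!-- placeholder test class -->
      <class name="com.example.tests.LoginTest"/>
    </classes>
  </test>
</suite>
```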

---------------------------------------------------------------------------------------------

// add system/environment info to the report
ChainPluginService.getInstance().addSystemInfo("Build#", "1.0");
ChainPluginService.getInstance().addSystemInfo("Owner Name#", "Ramesh Ch");

// adding screenshots to the test method
@AfterMethod
public void attachScreenshot(ITestResult result) {
    if (!result.isSuccess()) {
        ChainTestListener.embed(takeScreenshot(), "image/png");
    }
}

/**
 * Captures the current browser window as a PNG.
 * @return the screenshot bytes
 */
public byte[] takeScreenshot() {
    return ((TakesScreenshot) driver).getScreenshotAs(OutputType.BYTES);
}

// add logs in your test cases
ChainTestListener.log("actual title : ====> " + actualTitle);
ChainTestListener.log("actual URL : ====> " + actualURL);
ChainTestListener.log("----checking logo displayed or not...");

------------------------------------------------------------------------------------------------------


Why Extent Reports?
        1) Beautiful-looking reports
        2) Automation framework independent
        3) Easy-to-use APIs
        4) Provides a dashboard for the entire run
        5) Attach screenshots
        6) Customizable

For more information, refer to the ExtentReports documentation.

------------------------------------------------------------------------------------------------------------------
ExtentManager.java

package com.qa.linkedin.listeners;
import java.io.File;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.reporter.ExtentHtmlReporter;
import com.aventstack.extentreports.reporter.configuration.Theme;
import com.qa.linkedin.util.Constants;
import com.qa.linkedin.util.TestUtil;


public class ExtentManager {
    private static final Logger log = LogManager.getLogger(ExtentManager.class.getName());
    private static ExtentReports extent;

    // synchronized so parallel test threads cannot create two instances
    public static synchronized ExtentReports getInstance() {
        if (extent == null) {
            createInstance();
        }
        return extent;
    }

    public static synchronized ExtentReports createInstance() {
        String fileName = TestUtil.getReportName();
        String reportsDirectory = Constants.REPORTS_DIRECTORY;
        new File(reportsDirectory).mkdirs();
        String path = reportsDirectory + fileName;
        log.info("*********** Report Path ***********");
        log.info(path);
        log.info("*********** Report Path ***********");

        ExtentHtmlReporter htmlReporter = new ExtentHtmlReporter(path);
        htmlReporter.config().setTheme(Theme.STANDARD);
        htmlReporter.config().setDocumentTitle("Linkedin Automation Test Run");
        htmlReporter.config().setEncoding("utf-8");
        htmlReporter.config().setReportName(fileName);

        extent = new ExtentReports();
        extent.setSystemInfo("Organization", "RameshQaAutomationPlatform");
        extent.setSystemInfo("Automation Framework", "Selenium Webdriver");
        extent.setSystemInfo("Automation Tester", "Ramesh");
        extent.setSystemInfo("Build no", "QA-1234");
        extent.attachReporter(htmlReporter);

        return extent;
    }
}
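TestUtil.getReportName() is not shown in this post. A typical implementation (a hypothetical sketch, not the author's actual code) produces a timestamped file name so that each run writes a fresh report instead of overwriting the last one:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

class ReportNameSketch {
    // Hypothetical stand-in for TestUtil.getReportName(); the prefix and
    // pattern are illustrative choices
    static String getReportName(LocalDateTime now) {
        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss");
        return "TestReport_" + now.format(fmt) + ".html";
    }

    public static void main(String[] args) {
        // e.g. TestReport_2024-01-02_03-04-05.html
        System.out.println(getReportName(LocalDateTime.now()));
    }
}
```

Taking the timestamp as a parameter (rather than calling LocalDateTime.now() inside) keeps the method easy to unit test.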
=================================================
ExtentReportListener:
----------------------------------------------------------------------------
package com.qa.linkedin.listeners;
import java.io.IOException;
import java.util.Arrays;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.openqa.selenium.WebDriver;
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestNGMethod;
import org.testng.ITestResult;
import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.MediaEntityBuilder;
import com.aventstack.extentreports.Status;
import com.aventstack.extentreports.markuputils.ExtentColor;
import com.aventstack.extentreports.markuputils.Markup;
import com.aventstack.extentreports.markuputils.MarkupHelper;
import com.qa.linkedin.base.TestBase;
import com.qa.linkedin.base.WebDriverFactory;
import com.qa.linkedin.util.BasePageWebActions;
import com.qa.linkedin.util.TestUtil;

public class ExtentReportListener extends TestBase implements ITestListener {
    private static ExtentReports extentReports = ExtentManager.getInstance();
    private static ThreadLocal<ExtentTest> extentTest = new ThreadLocal<ExtentTest>();
    private static final Logger log = LogManager.getLogger(ExtentReportListener.class.getName());

    /**
     * Invoked after the test class is instantiated and before
     * any configuration method is called.
     *
     * @param context
     */
    public void onStart(ITestContext context) {
        log.info("onStart -> Test Tag Name: " + context.getName());
        ITestNGMethod[] methods = context.getAllTestMethods();
        log.info("These methods will be executed in this <test> tag");
        for (ITestNGMethod method: methods) {
            log.info(method.getMethodName());
        }
    }

    /**
     * Invoked after all the tests have run and all their
     * Configuration methods have been called.
     *
     * @param context
     */
    public void onFinish(ITestContext context) {
        log.info("onFinish -> Test Tag Name: " + context.getName());
        extentReports.flush();
    }
    /**
     * Invoked each time before a test method will be invoked.
     *
     * @param result
     * @see ITestResult#STARTED
     */
    public void onTestStart(ITestResult result) {
        ExtentTest test = extentReports.createTest(result.getInstanceName() + " :: "
                + result.getMethod().getMethodName());
        extentTest.set(test);
    }

    /**
     * Invoked each time a test method succeeds.
     *
     * @param result
     * @see ITestResult#SUCCESS
     */
    public void onTestSuccess(ITestResult result) {
        log.info("onTestSuccess -> Test Method Name: " + result.getName());
        String methodName = result.getMethod().getMethodName();
        String logText = "<b>" + "Test Method " + methodName + " Successful" + "</b>";
        Markup m = MarkupHelper.createLabel(logText, ExtentColor.GREEN);
        extentTest.get().log(Status.PASS, m);
    }

    /**
     * Invoked each time a test method fails.
     *
     * @param result
     * @see ITestResult#FAILURE
     */
    public void onTestFailure(ITestResult result) {
        log.info("onTestFailure -> Test Method Name: " + result.getName());
        String methodName = result.getMethod().getMethodName();
        String exceptionMessage = Arrays.toString(result.getThrowable().getStackTrace());
        extentTest.get().fail("<details>" + "<summary>" + "<b>" + "<font color=red>" +
                "Exception Occurred: Click to see details: " + "</font>" + "</b>" + "</summary>" +
        exceptionMessage.replaceAll(",", "<br>") + "</details>" + " \n");

        String browser = WebDriverFactory.getInstance().getBrowser();
        WebDriver driver = WebDriverFactory.getInstance().getDriver(browser);
        BasePageWebActions cd = new BasePageWebActions();
        String path = cd.takeScreenshot(result.getName(), browser);
        try {
            extentTest.get().fail("<b>" + "<font color=red>" +
                    "Screenshot of failure" + "</font>" + "</b>",
                    MediaEntityBuilder.createScreenCaptureFromPath(path).build());
        } catch (IOException e) {
            extentTest.get().fail("Test Method Failed, cannot attach screenshot");
        }

        String logText = "<b>" + "Test Method " + methodName + " Failed" + "</b>";
        Markup m = MarkupHelper.createLabel(logText, ExtentColor.RED);
        extentTest.get().log(Status.FAIL, m);
    }

    /**
     * Invoked each time a test method is skipped.
     *
     * @param result
     * @see ITestResult#SKIP
     */
    public void onTestSkipped(ITestResult result) {
        log.info("onTestSkipped -> Test Method Name: " + result.getName());
        String methodName = result.getMethod().getMethodName();
        String logText = "<b>" + "Test Method " + methodName + " Skipped" + "</b>";
        Markup m = MarkupHelper.createLabel(logText, ExtentColor.YELLOW);
        extentTest.get().log(Status.SKIP, m);
    }

    /**
     * Invoked each time a method fails but has been annotated with
     * successPercentage and this failure still keeps it within the
     * success percentage requested.
     *
     * @param result <code>ITestResult</code> containing information about the run test
     * @see ITestResult#SUCCESS_PERCENTAGE_FAILURE
     */
    public void onTestFailedButWithinSuccessPercentage(ITestResult result) {
        // Ignore this
    }
}
-------------------------------------------------------------------
