Original article was published on Deep Learning on Medium

# Association Mining and various terminologies associated with it

Let’s continue our learning

In this blog we will discuss a very interesting and useful topic in machine learning, i.e. *Association Mining*.

## What is Association Mining?

Association mining is the technique of finding associations between commodities, i.e. extracting the relationships between different kinds of items. Let us first see what this means.

**Association Rule Learning:**

Association rule learning is a machine learning method that uses a set of rules to discover interesting relations between variables in large databases, e.g. the transaction database of a store. *It identifies frequent associations among variables, called association rules, each consisting of an antecedent (if) and a consequent (then).*

## Objective of Association Mining

The task of association rule learning is to discover the relationships between items and identify the rules of association.

Association rule learning algorithms are used extensively in data mining for **market basket analysis**: determining dependencies among the various products that customers purchase at different times by analyzing the customer transaction databases.

## Association rule mining algorithms

There are two basic algorithms for association rule learning or mining:

- **Apriori Algorithm**
- **Eclat Algorithm**

## Important terminologies

There are some important terminologies used by these algorithms, which are as follows:

**1. Support:** Support is the fraction of all transactions in which an item appears, like the frequency of burger among all the transactions. Mathematically, the support of an item A is given as Support(A) = (transactions containing A) / (total transactions).

**2. Confidence:** Confidence is the conditional probability of occurrence of a consequent (then) given the occurrence of an antecedent (if). It is a way of testing a rule: if a customer buys a burger (antecedent), how likely is he to buy french fries (consequent)? Mathematically, the confidence of **B** given **A** is given as Confidence(A → B) = Support(A ∪ B) / Support(A).

**3. Lift:** Lift is the ratio of the confidence of a rule to the support of its consequent. It tells how much more likely an item is to be purchased when another item is purchased. Simply, it is the likelihood of buying french fries if a customer buys a burger, relative to the baseline likelihood of buying french fries. Mathematically, the lift is given as Lift(A → B) = Confidence(A → B) / Support(B).
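The three measures above can be sketched in a few lines of code. This is a minimal illustration over a made-up toy transaction list (the items and counts are assumptions for demonstration, not the article's data):

```python
# Toy transactions: each transaction is a set of purchased items (assumed data).
transactions = [
    {"burger", "french fries", "cola"},
    {"burger", "french fries"},
    {"milk", "bread"},
    {"burger", "cola"},
    {"bread", "french fries"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    count = sum(1 for t in transactions if itemset <= t)
    return count / len(transactions)

def confidence(antecedent, consequent):
    """P(consequent | antecedent) = Support(A ∪ B) / Support(A)."""
    return support(antecedent | consequent) / support(antecedent)

def lift(antecedent, consequent):
    """Confidence of the rule divided by the support of the consequent."""
    return confidence(antecedent, consequent) / support(consequent)

a, c = {"burger"}, {"french fries"}
print(support(a))        # 3 of 5 transactions contain burger -> 0.6
print(confidence(a, c))  # 2 of the 3 burger transactions also have fries -> 0.667
print(lift(a, c))        # 0.667 / 0.6 -> about 1.11, a mild positive association
```

A lift above 1 indicates the two items appear together more often than if they were independent.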

## Working of an association mining algorithm, i.e. the Apriori algorithm

**Step 1:** Set a minimum support and confidence.

**Step 2:** Take all the subsets in the transactions having higher support than the minimum support.

**Step 3:** Take all the rules over these subsets having higher confidence than the minimum confidence.

**Step 4:** Sort the rules by decreasing lift.

*An example for understanding the working of the Apriori algorithm is as follows:*

Suppose we have the data record of some customers' purchases from a supermarket. Our aim is to find possible associations from the given database.

First, we will calculate the frequency table for the itemset.

**Step 1:** Let's say we have set the minimum support and confidence to 15%; 15% of 22 transactions is 3.3. *That means items having support less than 15%, i.e. frequency less than 3.3, will be discarded.*

Then we are left with the following items.

**Step 2:** Now our possible subsets for the above itemsets will be:

**Step 3:** Select the subsets having higher confidence, which leaves {Burger, French Fries}, i.e. the rule *Burger → French Fries*.

**Step 4:** As we are left with only one rule, we calculate the lift for this rule, which is approximately 3.7.
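As a sanity check on the arithmetic, the lift can be recomputed from the counts. The article's frequency table is not reproduced here, so the per-item counts below are hypothetical values chosen only to illustrate how a lift near 3.7 arises from 22 transactions:

```python
total = 22   # total transactions (from the article)
burger = 6   # hypothetical: transactions containing Burger
fries = 4    # hypothetical: transactions containing French Fries
both = 4     # hypothetical: transactions containing both items

confidence = both / burger     # P(French Fries | Burger) = 4/6 ≈ 0.667
support_fries = fries / total  # baseline P(French Fries) = 4/22 ≈ 0.182
lift = confidence / support_fries
print(round(lift, 1))          # prints 3.7
```

Buying a burger makes french fries roughly 3.7 times more likely than the baseline, which is why this rule survives the sort in Step 4.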

## Applications of association rule mining

- Supermarket item association analysis
- Web mining
- Medical analysis
- Bioinformatics
- Network analysis
- Programming pattern finding, etc.

## Conclusion

In this blog I tried to cover the following topics: *what association mining is, its objectives, the concept of association rule mining and its algorithms, important terminologies used in association mining, the working of an association mining algorithm with a suitable example, and its applications in the real world.*

This is all about Association Mining. Here comes the end of this blog.

**THANKS FOR YOUR VALUABLE TIME .**