
Entropy-of-strings

This JavaScript application calculates the entropy of a string. The core of the implementation is a function called "entropy", which receives a text sequence as a parameter and returns its entropy. Entropy is a measure of the uncertainty in a random variable. In the context of information theory, the term "entropy" refers to the Shannon entropy:

H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i

Which can also be written as:

H(X) = \sum_{i=1}^{n} p_i \log_2 \frac{1}{p_i}

Where n represents the total number of symbols in the alphabet of a sequence and p_i represents the probability of occurrence of a symbol i found in the alphabet. A step-by-step version of the entropy calculation is also shown here. For more detailed information on entropy, please see the specialized chapter in the book entitled Algorithms in Bioinformatics: Theory and Implementation.
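For instance (an illustrative sequence, not one taken from the original text), consider the string "AACG": its alphabet contains n = 3 symbols, with probabilities p_A = 1/2, p_C = 1/4 and p_G = 1/4, so

H = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{4}\log_2 4 = 0.5 + 0.5 + 0.5 = 1.5 \text{ bits}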

function entropy(c){

	//ALPHABET DETECTION
	//Collect every distinct symbol that occurs in the string.
	var a = [];
	var t = c.split('');
	var k = t.length;

	for(var i=0; i<k; i++){
		var q = 1;
		for(var j=0; j<a.length; j++){
			if (t[i] === a[j]) {q = 0;}
		}
		if (q === 1) {a.push(t[i]);}
	}

	//ENTROPY CALCULATION
	//For each symbol: probability p = occurrences / string length,
	//then add p * log2(1/p) to the total.
	var e = 0;

	for(var i=0; i<a.length; i++){

		//Count the occurrences of symbol a[i] directly
		//(counting characters avoids regex issues with
		//special symbols such as '.', '*' or '(').
		var f = 0;
		for(var j=0; j<k; j++){
			if (t[j] === a[i]) {f++;}
		}

		var p = f/k;

		//e += -(p*Log(2,p));
		e += (p*Log(2,(1/p)));
	}

	return e;
}

//Logarithm of v in base n.
function Log(n, v) {
	return Math.log(v) / Math.log(n);
}
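
As a quick usage sketch (the test strings below are illustrative and not part of the original repository), calling the function agrees with the hand calculation shown above:

// Hypothetical usage (example strings, not from the repository):
console.log(entropy("AACG")); // 1.5 - matches the worked example above
console.log(entropy("aaaa")); // 0   - a single repeated symbol carries no uncertainty
console.log(entropy("abcd")); // 2   - four equally likely symbols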

Live demo: https://gagniuc.github.io/Entropy-of-strings/

screenshot

References

  • Paul A. Gagniuc. Algorithms in Bioinformatics: Theory and Implementation. John Wiley & Sons, Hoboken, NJ, USA, 2021, ISBN: 9781119697961.
