Graham Kendall

Professor Graham Kendall

Professor Graham Kendall is the Provost and CEO of The University of Nottingham Malaysia Campus (UNMC). He is also a Pro-Vice Chancellor of the University of Nottingham.

He is a Director of MyResearch Sdn Bhd, Crops for the Future Sdn Bhd and Nottingham Green Technologies Sdn Bhd. He is a Fellow of the British Computer Society (FBCS) and a Fellow of the Operational Research Society (FORS).

He has published over 230 peer-reviewed papers. He is an Associate Editor of 10 journals and the Editor-in-Chief of the IEEE Transactions on Computational Intelligence and AI in Games.

News

I am involved with a spin-out company that specialises in Strategic Resource Planning
http://bit.ly/eTPZO2
Help solve Santa's logistics problems
http://bit.ly/1DXreuW

Latest Blog Post

How Isaac Newton could help you beat the casino at roulette

Random Blog Post

MISTA Conference: Proceedings

Publication(s)

Scheduling English Football Fixtures over the Holiday Period Using Hyper-heuristics
http://bit.ly/eeIVyB
Imperfect Evolutionary Systems
http://bit.ly/hC4SYn
A Tabu Search Hyper-heuristic Approach to the Examination Timetabling Problem at the MARA University of Technology
http://bit.ly/gDSeQN
An efficient guided local search approach for service network design problem with asset balancing
http://bit.ly/fY7uch

Graham Kendall: Details of Requested Publication


Citation

Pandey, H.M.; Chaudhary, A.; Mehrotra, D. and Kendall, G. Maintaining regularity and generalization in data using the minimum description length principle and genetic algorithm: Case of grammatical inference. Swarm and Evolutionary Computation, 31: 11-23, 2016.


Abstract

In this paper, a genetic algorithm with minimum description length (GAWMDL) is proposed for grammatical inference. The primary challenge in identifying a language of infinite cardinality from a finite set of examples is knowing when to generalize and when to specialize the training data. The minimum description length principle, which has been incorporated to address this issue, is discussed in this paper. Previously, the e-GRIDS learning model was proposed, which enjoyed the merits of the minimum description length principle, but it is limited to positive examples only. The proposed GAWMDL incorporates a traditional genetic algorithm and has a powerful global exploration capability that can exploit optimal offspring. This is an effective approach to handling a problem with a large search space, such as the grammatical inference problem. The computational capability of the genetic algorithm is not in question, but it still suffers from premature convergence, mainly arising from a lack of population diversity. The proposed GAWMDL incorporates a bit-mask-oriented data structure that performs the reproduction operations: the mask is created, then a Boolean-based procedure is applied to create offspring in a generative manner. The Boolean-based procedure is capable of introducing diversity into the population, hence alleviating premature convergence. The proposed GAWMDL is applied to context-free as well as regular languages of varying complexities. The computational experiments show that the GAWMDL finds an optimal or close-to-optimal grammar. A two-fold performance analysis has been performed. First, the GAWMDL has been evaluated against the elite mating pool genetic algorithm, which was proposed to introduce diversity and to address premature convergence. GAWMDL is also tested against the improved tabular representation algorithm. In addition, the authors evaluate the performance of the GAWMDL against a genetic algorithm not using the minimum description length principle. Statistical tests demonstrate the superiority of the proposed algorithm. Overall, the proposed GAWMDL algorithm greatly improves the performance in three main aspects: it maintains regularity of the data, alleviates premature convergence and is capable of grammatical inference from both positive and negative corpora.
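
The abstract describes a genetic algorithm whose fitness is driven by the minimum description length principle and whose offspring are produced through bit-mask (Boolean) operations. The short Python sketch below is not the GAWMDL from the paper; it is only a minimal, self-contained illustration of how an MDL score (model bits plus data-given-model bits) can serve as a GA fitness, and how a Boolean bit mask can drive crossover, on a toy corpus of positive examples. Everything in it (the rule pool, the cost constants, the greedy encoder) is an assumption made purely for the example.

import math
import random

# Toy illustration only -- not the authors' GAWMDL. Each individual is a bit
# mask selecting "rules" (candidate substrings) from a fixed pool; the MDL
# fitness balances the cost of the rules against the cost of encoding the
# positive examples with them.

POSITIVE_EXAMPLES = ["ab", "abab", "ababab", "abababab"]   # toy corpus
RULE_POOL = ["a", "b", "ab", "ba", "abab", "bb"]           # candidate rules
BITS_PER_SYMBOL = 8                                        # crude raw-symbol cost

def description_length(mask):
    """MDL fitness: bits for the selected rules + bits to encode the corpus."""
    rules = [r for r, keep in zip(RULE_POOL, mask) if keep]
    model_bits = sum(len(r) for r in rules) * BITS_PER_SYMBOL
    code_size = max(len(rules) + 1, 2)          # +1 escape code for raw symbols
    bits_per_token = math.ceil(math.log2(code_size))
    data_bits = 0
    for example in POSITIVE_EXAMPLES:
        i = 0
        while i < len(example):
            match = max((r for r in rules if example.startswith(r, i)),
                        key=len, default=None)  # greedy longest-rule cover
            if match:
                data_bits += bits_per_token
                i += len(match)
            else:                               # fall back to a raw symbol
                data_bits += bits_per_token + BITS_PER_SYMBOL
                i += 1
    return model_bits + data_bits               # smaller is better

def boolean_crossover(p1, p2):
    # Bit-mask style reproduction: a random Boolean mask decides, per gene,
    # which parent contributes the bit (uniform crossover).
    pick = [random.random() < 0.5 for _ in p1]
    return [a if p else b for a, b, p in zip(p1, p2, pick)]

def mutate(individual, rate=0.1):
    return [(not g) if random.random() < rate else g for g in individual]

def run_ga(pop_size=30, generations=100):
    population = [[random.random() < 0.5 for _ in RULE_POOL]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=description_length)          # lower MDL = fitter
        survivors = population[:pop_size // 2]
        children = [mutate(boolean_crossover(random.choice(survivors),
                                             random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    best = min(population, key=description_length)
    return [r for r, keep in zip(RULE_POOL, best) if keep], description_length(best)

if __name__ == "__main__":
    rules, score = run_ga()
    print("selected rules:", rules, "MDL score:", score)

In this toy setting, selecting "ab" as a rule shortens the corpus encoding by more than the cost of storing the rule, which is the generalize-versus-specialize trade-off that the MDL principle formalises.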


pdf

You can download the pdf of this publication from http://www.graham-kendall.com/papers/pcmk2016.pdf


doi

The doi for this publication is 10.1016/j.swevo.2016.05.002. You can link directly to the original paper, via the doi, from here

What is a doi?: A doi (Digital Object Identifier) is a unique identifier for scientific papers (and occasionally other material). It provides direct access to the location where the original article is published via the URL http://dx.doi.org/xxxx (replacing xxxx with the doi). See http://dx.doi.org/ for more information
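
As a concrete example of that substitution, the doi quoted above can be turned into a resolvable link; a minimal Python sketch of the pattern:

doi = "10.1016/j.swevo.2016.05.002"   # the doi quoted above
link = "http://dx.doi.org/" + doi     # substitute the doi into the resolver URL
print(link)                           # http://dx.doi.org/10.1016/j.swevo.2016.05.002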


Journal Rankings


ISI Web of Knowledge Journal Citation Reports

The Web of Knowledge Journal Citation Reports (often known as ISI Impact Factors) help measure how often an article is cited. You can get an introduction to Journal Citation Reports here. Below I have provided the ISI impact factor for the journal in which this article was published. For complete information I have shown the ISI ranking over a number of years, with the latest ranking highlighted.

2015 (2.963)

URL

This publication does not have a URL associated with it.

The URL is only provided if there is additional information that might be useful. For example, where the entry is a book chapter, the URL might link to the book itself.


Bibtex

@ARTICLE{pcmk2016,
author = {H.M. Pandey and A. Chaudhary and D. Mehrotra and G. Kendall},
title = {Maintaining regularity and generalization in data using the minimum description length principle and genetic algorithm: Case of grammatical inference},
journal = {Swarm and Evolutionary Computation},
year = {2016},
volume = {31},
pages = {11--23},
abstract = {In this paper, a genetic algorithm with minimum description length (GAWMDL) is proposed for grammatical inference. The primary challenge in identifying a language of infinite cardinality from a finite set of examples is knowing when to generalize and when to specialize the training data. The minimum description length principle, which has been incorporated to address this issue, is discussed in this paper. Previously, the e-GRIDS learning model was proposed, which enjoyed the merits of the minimum description length principle, but it is limited to positive examples only. The proposed GAWMDL incorporates a traditional genetic algorithm and has a powerful global exploration capability that can exploit optimal offspring. This is an effective approach to handling a problem with a large search space, such as the grammatical inference problem. The computational capability of the genetic algorithm is not in question, but it still suffers from premature convergence, mainly arising from a lack of population diversity. The proposed GAWMDL incorporates a bit-mask-oriented data structure that performs the reproduction operations: the mask is created, then a Boolean-based procedure is applied to create offspring in a generative manner. The Boolean-based procedure is capable of introducing diversity into the population, hence alleviating premature convergence. The proposed GAWMDL is applied to context-free as well as regular languages of varying complexities. The computational experiments show that the GAWMDL finds an optimal or close-to-optimal grammar. A two-fold performance analysis has been performed. First, the GAWMDL has been evaluated against the elite mating pool genetic algorithm, which was proposed to introduce diversity and to address premature convergence. GAWMDL is also tested against the improved tabular representation algorithm. In addition, the authors evaluate the performance of the GAWMDL against a genetic algorithm not using the minimum description length principle. Statistical tests demonstrate the superiority of the proposed algorithm. Overall, the proposed GAWMDL algorithm greatly improves the performance in three main aspects: it maintains regularity of the data, alleviates premature convergence and is capable of grammatical inference from both positive and negative corpora.},
doi = {10.1016/j.swevo.2016.05.002},
issn = {2210-6502},
keywords = {Bit-masking oriented data structure, Context free grammar, Genetic Algorithm, Grammar induction, Learning algorithm, Minimum description length principle},
owner = {Graham},
timestamp = {2013.07.31},
webpdf = {http://www.graham-kendall.com/papers/pcmk2016.pdf} }