
What is Lexical Analysis?

Jessica Susan Reuter

Lexical analysis is the process of taking a string of characters — or, more simply, text — and converting it into meaningful groups called tokens. This methodology has uses in a wide variety of applications, from interpreting computer languages to analyzing books. Lexical analysis is not synonymous with parsing; rather, it is the first step of the total parsing process, and it creates raw material for later use.

The building blocks of tokens, also called lexemes, can be generated in many ways, depending on the grammar required for lexical analysis. A common example is splitting sentences into words around spaces: each contiguous run of non-space characters is a lexeme. Text strings can be split on one or several types of delimiter characters, producing sets of lexemes of varying complexity. Tokens are generated after each lexeme has been evaluated and paired with its corresponding value; by definition, a token is this pairing, not the lexeme alone.
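As a minimal sketch of this pairing in Python, the toy lexer below splits on whitespace and tags each lexeme with a token type; the NUMBER and WORD names are illustrative assumptions, not part of any standard:

```python
# A minimal, hypothetical lexer: split on whitespace, then pair each
# lexeme with a token type to form a (type, lexeme) token.
def tokenize(text):
    tokens = []
    for lexeme in text.split():  # each run of non-space characters is a lexeme
        if lexeme.isdigit():
            tokens.append(("NUMBER", lexeme))
        else:
            tokens.append(("WORD", lexeme))
    return tokens

print(tokenize("route 66 is a highway"))
# [('WORD', 'route'), ('NUMBER', '66'), ('WORD', 'is'), ('WORD', 'a'), ('WORD', 'highway')]
```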


Lexical analysis, somewhat counter-intuitively, strips a text string of its context. Its purpose is only to generate building blocks for further study, not to determine whether those pieces are valid or invalid. In the case of computer language interpretation, validation is done by syntax analysis; for natural-language text, validation can be done in terms of context or content. If an input string is completely divided into appropriate lexemes, and each of those lexemes has an appropriate value, the analysis is considered successful.
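To make the separation concrete, here is a hedged sketch that builds on the toy lexer above: it happily tokenizes an arithmetic expression that no syntax analyzer would accept, because judging the order of tokens is not the lexer's job. The token names and patterns are assumptions made for illustration:

```python
import re

# Hypothetical token patterns for a tiny arithmetic language.
TOKEN_SPEC = [("NUMBER", r"\d+"), ("PLUS", r"\+"), ("SKIP", r"\s+")]
MASTER = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

def tokenize(text):
    # Pair every matched lexeme with its token type, discarding whitespace.
    return [(m.lastgroup, m.group()) for m in MASTER.finditer(text)
            if m.lastgroup != "SKIP"]

# Lexically fine, syntactically nonsense: the lexer does not care.
print(tokenize("3 + + 5"))
# [('NUMBER', '3'), ('PLUS', '+'), ('PLUS', '+'), ('NUMBER', '5')]
```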

Without context or the ability to perform validation, lexical analysis cannot be reliably used to find errors in input. A lexical grammar might have error values assigned to specific lexemes, and such analysis can also detect illegal or malformed tokens. Although finding an illegal or malformed token does signal invalid input, it has no bearing on whether the other tokens are valid, and so it is not strictly a type of validation.
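One common way to arrange this, sketched under the same toy assumptions as above, is a catch-all pattern that tags any unrecognized character as an error token while the rest of the input continues to tokenize normally:

```python
import re

# The same toy lexer, extended with a catch-all ERROR rule: any single
# character that no earlier pattern matches becomes an ERROR token.
TOKEN_SPEC = [("NUMBER", r"\d+"), ("PLUS", r"\+"),
              ("SKIP", r"\s+"), ("ERROR", r".")]
MASTER = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

def tokenize(text):
    return [(m.lastgroup, m.group()) for m in MASTER.finditer(text)
            if m.lastgroup != "SKIP"]

# The '?' is flagged as malformed, but the surrounding tokens are still
# produced; nothing is said about whether the input as a whole is valid.
print(tokenize("3 ? 5"))
# [('NUMBER', '3'), ('ERROR', '?'), ('NUMBER', '5')]
```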

Though lexical analysis is an integral part of many algorithms, it must often be used in conjunction with other methodologies to create meaningful results. For example, splitting a text string into words to determine frequencies makes use of lexeme creation, but lexeme creation alone cannot count the number of times a particular lexeme appears in the input. Lexical analysis might be useful on its own if the lexemes themselves are of note, but large amounts of input can make analysis of raw lexemes difficult because of the sheer volume of data.
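As an illustration of layering another method on top of lexeme creation, a word-frequency count might look like the following sketch, which feeds whitespace-split lexemes into Python's collections.Counter:

```python
from collections import Counter

# Lexeme creation (splitting) supplies the raw material; the Counter
# performs the separate job of tracking how often each lexeme occurs.
text = "the cat sat on the mat and the dog sat too"
frequencies = Counter(text.split())

print(frequencies.most_common(2))
# [('the', 3), ('sat', 2)]
```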
