Difference Between Varchar and Nvarchar

Varchar vs Nvarchar

The difference between varchar and nvarchar lies in how data is stored in a database. A database consists of data, and data is defined by data types. A data type tells what kind of value a column may contain. Each column in a database table must have a name and a data type. Many data types are available in database design today. Among them, varchar and nvarchar are used to store character strings. Varchar and nvarchar may seem interchangeable, but the two types have different advantages and are used for different purposes.

What is Varchar?

As the name suggests, varchar is a varying character, or varying char. The syntax of varchar is VARCHAR [(n|max)]. Varchar stores non-Unicode data, such as ASCII, and is the data type used in everyday situations. Varchar uses one byte per character and also stores the length of each string in the database. Varchar has a variable data length: varchar(n) can store a maximum of 8000 non-Unicode characters, while varchar(max) can store up to 2 GB. The real storage size of a value is the actual length of the data plus two bytes. This data type is very flexible and will accept many different kinds of input. Varchar does not pad the unused part of the string with blank characters. Although varchar can be slower than char, it uses dynamic memory allocation. Values that are not conceptually strings, such as dates like “February 14th” or “12/11/2014”, can also be stored in a varchar column as text.
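This single-byte storage model can be illustrated outside the database. The Python sketch below is an analogy, not SQL Server's actual internals: it encodes a string with a single-byte code page (latin-1 stands in for varchar's non-Unicode code page) and adds the two-byte length overhead described above.

```python
# Illustrative model of varchar storage (an analogy, not database internals).
text = "February 14th"

# A single-byte code page stores one byte per character, as varchar does.
encoded = text.encode("latin-1")
assert len(encoded) == len(text)  # 1 byte per character

# Real storage size = actual data length + 2 bytes of length overhead.
storage_bytes = len(encoded) + 2
print(storage_bytes)  # 13 characters + 2 = 15 bytes
```

Note that only characters representable in the chosen code page fit; anything outside it would need nvarchar.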


What is Nvarchar?

Nvarchar stands for national varying character, or national varying char. The syntax of nvarchar is NVARCHAR [(n|max)]. Nvarchar can store different kinds of variable-length data: it holds Unicode data, including multilingual text and double-byte languages such as Chinese. Nvarchar uses two bytes per character; nvarchar(n) can store a maximum of 4000 characters, while nvarchar(max) can store up to 2 GB. Nvarchar treats “” as an empty string with zero character length. The storage size is twice the number of characters plus two bytes. In nvarchar, trailing spaces are not removed when the value is stored and retrieved.
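The two-byte-per-character model can be sketched the same way. Again as an analogy rather than actual database internals, the snippet below encodes a Chinese string as UTF-16 little-endian, which uses two bytes per character for characters in the Basic Multilingual Plane, and adds the two-byte overhead.

```python
# Illustrative model of nvarchar storage (an analogy, not database internals).
text = "数据库"  # "database" in Chinese: 3 characters

# UTF-16 (little-endian) uses two bytes per character for BMP characters,
# matching nvarchar's two-bytes-per-character storage.
encoded = text.encode("utf-16-le")
assert len(encoded) == 2 * len(text)

# Storage size = twice the number of characters + 2 bytes of overhead.
storage_bytes = len(encoded) + 2
print(storage_bytes)  # 2 * 3 + 2 = 8 bytes
```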

What is the difference between Varchar and Nvarchar?

The key difference between varchar and nvarchar lies in how data is stored in a database.

• Varchar stores ASCII values and nvarchar stores Unicode characters.

• Varchar uses one byte per character while nvarchar uses two bytes per character.

• Varchar [(n)] stores non-Unicode characters with variable length and Nvarchar [(n)] stores Unicode characters with variable length.

• Varchar can store a maximum of 8000 non-Unicode characters, while nvarchar can store a maximum of 4000 Unicode characters.

• Varchar is better in places where variables hold only non-Unicode characters. Nvarchar is used in places where variables hold Unicode characters.

• The storage size of varchar is a number of bytes equal to the number of characters plus two bytes reserved for the offset. Nvarchar uses a number of bytes equal to twice the number of characters plus two bytes reserved for the offset.

• All modern operating systems and development platforms use Unicode internally. Therefore, nvarchar is often preferred over varchar in order to avoid data type conversions.
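The two storage formulas in the list above can be put side by side. This is a minimal sketch using the same encoding analogy as before (latin-1 standing in for a non-Unicode code page, UTF-16 little-endian for Unicode), not a description of actual database internals:

```python
def varchar_storage(text: str) -> int:
    # 1 byte per character + 2 bytes reserved for the offset
    return len(text.encode("latin-1")) + 2

def nvarchar_storage(text: str) -> int:
    # 2 bytes per character (BMP) + 2 bytes reserved for the offset
    return len(text.encode("utf-16-le")) + 2

sample = "hello"
print(varchar_storage(sample))   # 5 + 2 = 7 bytes
print(nvarchar_storage(sample))  # 2 * 5 + 2 = 12 bytes
```

For the same text, nvarchar costs roughly twice the storage of varchar, which is the trade-off for being able to hold any Unicode character.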

Summary:

Nvarchar vs Varchar

Varchar and nvarchar are variable-length data types used to store different types of strings. These data types are helpful on modern operating systems because they can avoid converting data from one type to another to match the platform. Varchar and nvarchar therefore help the programmer distinguish Unicode from non-Unicode strings without much difficulty. Both data types are very useful in programming.

