TECHTalksPro

Sunday, January 8, 2017

SQL Server Data Types

Chitchatiq · 1/08/2017 06:07:00 PM · SQL Server


SQL Server provides many data types, but we do not need to cover all of them at once. Here we will focus on the most frequently used ones.

What is a data type?
A data type is a classification that defines the characteristics of the data a column or variable can hold.
For example, INT accepts only whole numbers; string and decimal values cannot be stored in an INT column.
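As a quick illustration (the variable name here is just for demonstration), assigning a string to an INT variable fails with a conversion error:

```sql
-- An INT variable accepts whole numbers only
DECLARE @age INT;
SET @age = 30;        -- succeeds
SET @age = 'thirty';  -- fails with a conversion error (varchar cannot be converted to int)
```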

To make this concrete, let us relate some data types to the properties of a person:


Property              Example                Suitable data types
Name                  ABC                    char, varchar, nvarchar
Date of birth         01-01-1900             date
Height                5.3                    numeric, float, decimal
Weight                62                     int, smallint, tinyint, bigint
Birth date and time   01-01-1900 06:30:02    datetime, datetime2, smalldatetime, datetimeoffset
Birth time            06:30:02               time

The list below gives each data type with its range, storage, and points to remember.


String types:

char(size)
    Max size: 1-8000. Fixed-length, non-Unicode string data; the storage size is (size) bytes.
    Notes: when size is not specified in a data definition or variable declaration, the default length is 1; when size is not specified in CAST or CONVERT, the default length is 30. Use the Unicode types (nchar/nvarchar) if you need to store Chinese, Japanese, or other non-Latin characters; for plain English text, char and varchar are enough.

nchar(size)
    Max size: 1-4000. Fixed-length, Unicode string data; the storage size is 2 × (size) bytes.

varchar(size) or varchar(max)
    Max size: 1-8000 (varchar(max) stores up to 2 GB). Variable-length, non-Unicode string data.

nvarchar(size) or nvarchar(max)
    Max size: 1-4000 (nvarchar(max) stores up to 2 GB). Variable-length, Unicode string data.

bit
    Values: 1, 0, or NULL. Stored very compactly: SQL Server packs up to 8 bit columns into one byte.

Exact numeric types:

tinyint
    Range: 0 to 255 (1 byte).

smallint
    Range: -32,768 to 32,767 (2 bytes).

int
    Range: -2,147,483,648 to 2,147,483,647 (4 bytes).

bigint
    Range: -9,223,372,036,854,775,808 to 9,223,372,036,854,775,807 (8 bytes).

decimal(p,s) / numeric(p,s)
    Numeric data with fixed precision and scale: p is the total number of digits (1-38) and s is the number of digits to the right of the decimal point. decimal and numeric are functionally identical.

float(n)
    Approximate-numeric type; n defaults to 53.

Date and time types:

date
    Range: '0001-01-01' to '9999-12-31'. Format: 'YYYY-MM-DD'.

datetime
    Range: '1753-01-01 00:00:00' to '9999-12-31 23:59:59.997'. Format: 'YYYY-MM-DD hh:mm:ss[.mmm]'.

datetime2
    Range: '0001-01-01 00:00:00' to '9999-12-31 23:59:59.9999999'. Format: 'YYYY-MM-DD hh:mm:ss[.fractional seconds]' (up to 7 digits).

smalldatetime
    Range: '1900-01-01' to '2079-06-06'; accurate to the minute. Format: 'YYYY-MM-DD hh:mm:ss'.

time
    Range: '00:00:00.0000000' to '23:59:59.9999999'. Format: 'hh:mm:ss[.nnnnnnn]'.

datetimeoffset
    Like datetime2, but it also carries a time-zone offset in the range -14:00 to +14:00. Format: 'YYYY-MM-DD hh:mm:ss[.nnnnnnn] [{+|-}hh:mm]'.
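The types above can be combined in a single table definition. The sketch below is illustrative only; the table and column names are invented for this example:

```sql
-- Illustrative table using the data types discussed above
CREATE TABLE dbo.Person
(
    PersonID   INT IDENTITY(1,1) PRIMARY KEY,             -- whole numbers, 4 bytes
    Name       NVARCHAR(100) NOT NULL,                    -- variable-length Unicode
    CountryCode CHAR(2)      NOT NULL,                    -- fixed-length, always 2 bytes
    DOB        DATE          NULL,                        -- date only, no time part
    Height     DECIMAL(4,1)  NULL,                        -- e.g. 5.3 (4 digits total, 1 after the point)
    Weight     TINYINT       NULL,                        -- 0-255 is enough for kilograms
    BirthTime  TIME(0)       NULL,                        -- time only, whole seconds
    CreatedAt  DATETIME2(3)  NOT NULL DEFAULT SYSDATETIME(), -- date + time, millisecond precision
    IsActive   BIT           NOT NULL DEFAULT 1
);
```

Choosing the narrowest type that fits the data (TINYINT instead of INT for weight, CHAR(2) for a fixed-width code) keeps rows smaller and indexes faster.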

SQL SERVER – Part 1 - What are Data, Database, and DBMS?

Chitchatiq · 1/08/2017 01:14:00 AM · SQL Server

Before we learn what SQL Server is and how SQL works, let us first understand the terms data, database, and DBMS.


Data: information that has been translated into a form that is convenient to transfer or process.

Database: a systematic, organized collection of data stored in a computer.

DBMS (Database Management System): a collection of programs that enables users to access a database, manipulate data, and retrieve relevant information.


Saturday, January 7, 2017

Get Row Count Of All The Tables In SQL Server Database

Chitchatiq · 1/07/2017 03:27:00 PM · Problems&Solutions, SQL Server


Database developers and DBAs often need the row count of every table in a database: to check whether any table has 0 records, to find the top 5 largest tables, or to compare row counts between source and target databases after a backup or data load.

An inexperienced approach is to write SELECT COUNT(*) FROM [Table] for each table and collect the counts one by one. For a database with 200 tables, that means writing and executing 200 COUNT(*) statements and then gathering the results in Excel or a flat file, which can easily take an hour.

Time is precious, so why not do the same task in 2-3 minutes using SQL Server's system catalog views?

Here we go… 

-- Row count per user table, summed across partitions
SELECT
    sc.name + '.' + ta.name AS TableName,
    SUM(pa.rows)            AS RowCnt
FROM sys.tables ta
INNER JOIN sys.partitions pa
    ON pa.object_id = ta.object_id
INNER JOIN sys.schemas sc
    ON ta.schema_id = sc.schema_id
WHERE ta.is_ms_shipped = 0   -- exclude internal tables shipped by Microsoft
  AND pa.index_id < 2        -- 0 = heap, 1 = clustered index (avoids double counting)
GROUP BY sc.name, ta.name
ORDER BY RowCnt DESC;

Result: each table (schema.table) with its row count, sorted in descending order.


  



Monday, January 2, 2017

How to Fix ERROR: CANNOT PARALLELIZE AN UPDATE STATEMENT THAT UPDATES THE DISTRIBUTION COLUMNS

Chitchatiq · 1/02/2017 08:47:00 PM · Problems&Solutions, SQL Server



Error:
[DataDirect][ODBC Greenplum Wire Protocol driver][Greenplum]ERROR: Cannot parallelize an UPDATE statement that updates the distribution columns (File cdbmutate.c; Line 709; Routine apply_motion;)

Cause:
This error is raised when an UPDATE statement modifies a column that is part of the table's distribution key.

Solution:
Greenplum does not allow updates to distribution-key columns. Rework the logic so the required values are produced through a SELECT instead, for example by deleting the affected rows and re-inserting them with the new key value.
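One common workaround is a delete-and-reinsert inside a transaction. This is a minimal sketch; the table `orders` and its columns are hypothetical, and `cust_id` is assumed to be the distribution key:

```sql
-- Hypothetical table: orders, DISTRIBUTED BY (cust_id).
-- A direct UPDATE of cust_id would raise the error above, so instead
-- select the rows with the new key value, delete the originals, and re-insert.
BEGIN;

CREATE TEMP TABLE tmp_moved AS
SELECT 999 AS cust_id,          -- the new distribution-key value
       order_id,
       amount
FROM   orders
WHERE  cust_id = 100;           -- the rows being "updated"

DELETE FROM orders WHERE cust_id = 100;

INSERT INTO orders (cust_id, order_id, amount)
SELECT cust_id, order_id, amount
FROM   tmp_moved;

COMMIT;
```

Because the rows are re-inserted rather than updated in place, Greenplum can redistribute them to the correct segments for the new key value.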





Copyright © TECHTalksPro
Designed by Vasu