Saturday, March 24, 2018

Merge Pandas Dataframes and Remove NaN Records


The method below helps developers merge multiple dataframes that share the same columns into a single dataframe.

Assume we have two dataframes, r1 and r2, and we need to ignore records containing null values; the command below combines them and applies dropna():

import pandas as pd

merged_df = pd.concat([r1, r2], axis=0).dropna()  # stack rows, then drop rows containing NaN
merged_df.to_csv('output.csv', index=False, doublequote=False)
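For example, a minimal self-contained sketch (the sample dataframes and values below are made up for illustration):

import pandas as pd

# Two hypothetical dataframes with the same columns; r1 contains a NaN value
r1 = pd.DataFrame({'device': ['television', 'mobile'], 'cost': [100, None]})
r2 = pd.DataFrame({'device': ['driver'], 'cost': [110]})

merged_df = pd.concat([r1, r2], axis=0).dropna()
print(merged_df)  # the 'mobile' row is dropped because its cost is NaN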

Split Strings in BigQuery Using REGEXP

Assume that we have a BigQuery column with values like the ones below:

---------------------------------------------------------
pair
---------------------------------------------------------
television:100
mobile:250
driver: 110
---------------------------------------------------------

Expected Output
---------------------------------------------------------
Device        | Cost
---------------------------------------------------------
television    | 100
mobile        | 250
driver        | 110
---------------------------------------------------------

Use the BigQuery statement below to split the column (the source table name is a placeholder):

SELECT
  CASE
    WHEN REGEXP_MATCH(pair, ':') THEN REGEXP_EXTRACT(pair, r'(\w*):')
    ELSE pair
  END AS device,
  REGEXP_EXTRACT(pair, r':\s*(.*)') AS cost
FROM [project:dataset.source_table];
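If the same values need to be split client-side, pandas string methods can do the equivalent job; a minimal sketch using the sample data above (the dataframe here is constructed only for illustration):

import pandas as pd

df = pd.DataFrame({'pair': ['television:100', 'mobile:250', 'driver: 110']})
# Named groups become the output column names; \s* trims the stray space after the colon
split_df = df['pair'].str.extract(r'(?P<device>\w*):\s*(?P<cost>.*)')
print(split_df)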

Python Fundamentals - Operators

# Save the code below as a Python file and run it to see the output.

varA = 15
varB = 6

# 1. Addition operator
add_sample = varA + varB
print(add_sample)

# 2. Subtraction operator
sub_sample = varA - varB
print(sub_sample)

# 3. Multiplication operator
multiply_sample = varA * varB
print(multiply_sample)

# 4. Division operator
division_sample = varA / varB
print(division_sample)

# 5. Addition assignment (useful in loop statements; either method below can be used)

add_sample += 3
print(add_sample)

add_sample = add_sample + 1
print(add_sample)

# Similarly for other operators, place the operator sign before the equals sign.
# Examples: -=, *=, /=

# 6. Modulus operator

mod_sample = varA % varB
print(mod_sample)


# 7. Exponentiation operator

exp_sample = varA ** 2
print(exp_sample)

# Note: Operator precedence rule
# BODMAS: Brackets, Orders (exponents), Division, Multiplication, Addition, Subtraction
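
# To see the rule in action (an added illustration; values chosen arbitrarily):
print(15 - 6 * 2)    # 3  -> multiplication happens before subtraction
print((15 - 6) * 2)  # 18 -> brackets are evaluated first
print(2 ** 3 * 4)    # 32 -> exponentiation happens before multiplication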

Python Fundamentals - Strings and Indexes Example

# Save the code below as a Python file and run it to see the output.

# 1. Normal

strA = 'My First string in quotes'
strB = "My first string in double quotes"

print(strA + "; " + strB)

# 2. Escape Sequences
# escA = "My "first" double quotes"  (this line would raise a syntax error)
escA = "My 'first' double quotes"
escB = "My \"first\" double quotes"
print(escA + "; " + escB)

# 3. String Indexes
# Indexing starts at 0 in Python
indA = strA[0]  # 'M'
indB = strA[5]  # 'r'
print("Print indexes: " + indA + "; " + indB)

# 4. Slicing of Strings

strC = "Python"

sliceA = strC[:3]   # gives the first 3 characters: 'Pyt'
sliceB = strC[3:]   # gives everything from index 3 to the end: 'hon'
sliceC = strC[2:4]  # gives the 3rd and 4th characters: 'th'
print("Print slice indexes: " + sliceA + "; " + sliceB + "; " + sliceC)

Python Fundamentals - Variables and Datatypes Examples

# Save the code below as a Python file and run it to see the output.

# 1. Add a variable and assign datatype int

myInt = 5
print(myInt)

# 2. Add a variable and assign datatype float

myFloat = 5.5
print(myFloat)

# 3. Add a variable and assign datatype boolean
myBool = True
print(myBool)
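
# To confirm the datatype Python assigned, use the built-in type() function
# (a small supplementary check):
print(type(myInt))    # <class 'int'>
print(type(myFloat))  # <class 'float'>
print(type(myBool))   # <class 'bool'>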

Sunday, January 21, 2018

BigQuery Data Load with Command Line



Data load using bq supports three write modes:

1. Empty (default): Writes data into an empty table; if the table already contains data, the job throws an error.
bq query --max_rows=1000 --destination_table=<table_name> 'SELECT * FROM [project:dataset.source_table];'

2. Replace: Replaces the current table with the new query output.
Existing data in the destination table is lost, so use it wisely.
Suited to incremental loads that involve both updates and inserts.
bq query --replace --destination_table=<table_name> 'SELECT * FROM [project:dataset.source_table];'

3. Append: Appends new records to the existing table.
If the same command is executed more than once, it will create duplicate records.
Suited to incremental loads that involve only inserts.
bq query --append_table --destination_table=<table_name> 'SELECT * FROM [project:dataset.source_table];'
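
The same three write modes map to "write dispositions" in the BigQuery Python client library. A minimal sketch, assuming the google-cloud-bigquery package is installed, default credentials are configured, and the dataset and table names are placeholders:

from google.cloud import bigquery

client = bigquery.Client()

# Choose WRITE_EMPTY (default), WRITE_TRUNCATE (replace) or WRITE_APPEND (append)
job_config = bigquery.QueryJobConfig()
job_config.destination = client.dataset('dataset').table('destination_table')
job_config.write_disposition = bigquery.WriteDisposition.WRITE_APPEND

query_job = client.query('SELECT * FROM `project.dataset.source_table`', job_config=job_config)
query_job.result()  # wait for the query job to finish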

SQL 2016 New Features


  • Query Store: Maintains a history of query execution plans along with their performance data, and quickly identifies queries that have recently become slower, allowing administrators or developers to force the use of an older, better plan if needed.
  • PolyBase: Benefits you if your regular data processing involves a lot of large text files -- they can be stored in Azure Blob Storage or Hadoop and queried as if they were database tables.
  • Stretch Database: Moves part of your tables (configurable or automated) into an Azure SQL Database in the cloud in a secure fashion. When you query those tables, the query optimizer knows which rows are on your server and which rows are in Azure, and divides the workload accordingly.
  • JSON Support: Adds built-in support for quickly moving JSON data into and out of tables.
  • Row Level Security: Restricts which rows of a table users can view, based on a predicate function. This is very useful in multi-tenant environments where you may want to limit data access based on customer ID.
  • Always Encrypted: Uses an enhanced client library at the application side so that data stays encrypted in transit, at rest, and while in use in the database.
  • In-Memory Enhancements: Memory-optimized tables designed for high-speed data loading without locking issues and for high-volume session-state scenarios.