Speaking about Text Mining & Sentiment Analysis at SQL Saturday #826, Victoria BC – Mar 16, 2019

Does your enterprise data include social media posts, product reviews, and survey results with free-form text? Feel like you need a data scientist, equipped with tools like R, text mining, and a sentiment lexicon, to unlock the value of that data? Not anymore! Thanks to our friends at Microsoft, a developer or analyst can leverage Azure Cognitive Services and Power BI to easily analyze text data and create effective visual dashboards.

Join me at SQL Saturday #826 in beautiful Victoria, BC on Mar 16, 2019, and let's talk about data science techniques for text mining and sentiment analysis. This session will walk through how to use the Text Analytics APIs in Azure Cognitive Services to analyze free-form text responses to survey questions. We will learn how to parse key phrases, derive sentiment scores, and effectively use Power BI visualizations like word clouds, gauges, and histograms to build a dashboard for analyzing this data.

SQLSaturday is a free training event for Microsoft Data Platform professionals and those wanting to learn about SQL Server, Business Intelligence, and Analytics. This event will be held on Mar 16, 2019 at Camosun College, Lansdowne Campus – Young Building, 3100 Foul Bay Rd, Victoria, British Columbia, V8P 5J2, Canada. Please register to attend this full day of free training.

Session Goals:

  • Understand how to use Azure Cognitive Services APIs for analyzing text data
  • Learn how to load and prepare your data in Power BI and augment it by invoking Azure Cognitive Services APIs directly from Power BI
  • Learn various relevant data visualization techniques for building effective dashboards for qualitative text data
  • Sentiment Classification and word association using R
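To make the first two goals concrete: the Text Analytics sentiment endpoint accepts a JSON body containing a list of documents, each with an id, a language, and the text, and returns a score per document. Here is a minimal sketch in Python of building that request payload and pulling scores out of a response. The endpoint URL is a placeholder for your own Azure resource, and the request/response shapes follow the v2-era API, so treat this as illustrative rather than definitive.

```python
import json

# Hypothetical endpoint -- substitute the URL and key of your own
# Azure Cognitive Services resource.
ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"

def build_payload(responses):
    """Package free-form survey responses into the JSON shape the
    sentiment endpoint expects: a list of documents, each with an
    id, a language, and the text itself."""
    documents = [
        {"id": str(i), "language": "en", "text": text}
        for i, text in enumerate(responses, start=1)
    ]
    return {"documents": documents}

def extract_scores(response_body):
    """Pull {id: score} pairs out of a sentiment response.
    Scores run from 0 (most negative) to 1 (most positive)."""
    return {doc["id"]: doc["score"] for doc in response_body["documents"]}

payload = build_payload(["Great session!", "The room was too cold."])
print(json.dumps(payload, indent=2))
```

In the session we invoke this API directly from Power BI (via Power Query), but the payload you send and the scores you get back have exactly this shape.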

Speaking about Text Mining & Sentiment Analysis at the St.Louis SQL Server User Group meeting – Jan 9, 2018

Does your enterprise data include social media posts, product reviews, and survey results with free-form text? Feel like you need a data scientist, equipped with tools like R, text mining, and a sentiment lexicon, to unlock the value of that data? Not anymore! Thanks to our friends at Microsoft, a developer or analyst can leverage Azure Cognitive Services and Power BI to easily analyze text data and create effective visual dashboards.

Join me at the St.Louis SQL Server User Group meeting on Jan 9, 2018, and let's talk about data science techniques for text mining and sentiment analysis. This session will walk through how to use the Text Analytics APIs in Azure Cognitive Services to analyze free-form text responses to survey questions. We will learn how to parse key phrases, derive sentiment scores, and effectively use Power BI visualizations like word clouds, gauges, and histograms to build a dashboard for analyzing this data.

Please RSVP on EventBrite so that we can get a head count for the dinner order:

https://www.eventbrite.com/e/stlssug-bi-january-9th-2018-meeting-tickets-20069581659

Session Goals:

  • Understand how to use Azure Cognitive Services APIs for analyzing text data
  • Learn how to load and prepare your data in Power BI and augment it by invoking Azure Cognitive Services APIs directly from Power BI
  • Learn various relevant data visualization techniques for building effective dashboards for qualitative text data
  • Learn to leverage natural language queries in the Power BI service
  • Sentiment Classification using R
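The last goal above, lexicon-based sentiment classification, boils down to scoring each response by the sentiment words it contains. The session demonstrates this in R with a sentiment lexicon; the sketch below shows the same idea in Python with a toy lexicon of my own invention (real lexicons such as AFINN or Bing Liu's have thousands of entries).

```python
# A toy sentiment lexicon -- purely illustrative; a real lexicon
# assigns polarity values to thousands of words.
LEXICON = {
    "great": 1, "love": 1, "easy": 1,
    "bad": -1, "hate": -1, "difficult": -1,
}

def classify(text):
    """Score text by summing the lexicon values of its words,
    then label the total as positive, negative, or neutral."""
    words = text.lower().split()
    score = sum(LEXICON.get(w.strip(".,!?"), 0) for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(classify("I love how easy this is!"))   # positive
print(classify("The setup was difficult."))   # negative
```

The R version follows the same shape: tokenize, join against the lexicon, sum, and bucket the total.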

Speaking about DACPAC Deployments at the St.Louis Dev Up Conference, Oct 21st 2016

The world of SQL Server development has seen great improvements with the introduction of SQL Server Data Tools (SSDT), which allows you to generate the DACPAC deployment artifact. You can deploy these DACPACs to your SQL Server databases without ever having to generate or execute a SQL script. These script-less deployments can make your database rollouts more robust, reliable, and efficient.

I am presenting a session on this topic at the Dev Up conference (formerly known as the St.Louis Days of .NET) in St.Louis, Missouri, on Oct 21st, 2016. This 3-day event at the Ameristar Casino & Resort is the 9th annual edition of the conference. There are over 140 sessions on various topics, plus workshops and hands-on pre-compiler sessions. The conference offers plenty of opportunities to network with experts, peers, and vendors.

I am looking forward to attending the 2016 Dev Up conference next week! Hope to see you all there.

 

Speaking at the St.Louis Business Intelligence UG meeting on 2/9/2016

I am speaking at the St.Louis Business Intelligence User Group meeting on 2/9/2016. This user group meeting is focused on ETL topics. The topic of my presentation is “ETL in a Data Warehouse as a Service Environment”, and I will briefly go over the current and future state of the ETL setup at Lumeris. We will also have a quick discussion on the SQL Server Integration Services Catalog, the Project Deployment Model, and ISPACs. Click here to download the slide deck for my presentation from SlideShare.

I am very excited to attend this meeting and learn some great ETL stuff from the other presenters, especially Brian Knight! Hope to see a lot of the #STL #SQLFamily tomorrow.

Exploring MySQL – Part 1

I recently had an opportunity to work with MySQL for the first time, while volunteering to code for a charitable cause. It's been a year or so since I have worked with anything but Microsoft SQL Server, so it was interesting to take on the challenges of working with an open source database.

At first, I searched the internet for a client tool like SQL Server Management Studio to work with MySQL. This led me to MySQL Workbench, which was easy to download and install and quite intuitive to learn. I was impressed with the built-in data modeling features and the familiar interface that let me browse database objects and run queries. However, after some more exploration, I realized that in order to connect to a remote MySQL database, my user/login must be set up to allow access from my client machine (there are options to allow a user/login to access a MySQL database from any client machine).

So I went back to Google searching for answers and came across phpMyAdmin, which looked quite promising. Its installation document, however, seemed much more complicated and required several prerequisites, including an Apache web server with PHP. Drawing on my experience trying to learn Hadoop, I figured there might be a nicely packaged distribution bundling together all of the components I need for MySQL, similar to Cloudera's or Hortonworks' distribution of Hadoop. I was happy to discover XAMPP – an easy-to-install Apache distribution containing MySQL, PHP & Perl, which includes phpMyAdmin. However, this still left me unsure whether it would solve my problem of accessing a remote MySQL database to create tables and stored procedures.

I got my breakthrough while talking to the other volunteer coders on the project and got access to cPanel – a graphical web-based control panel that simplifies website and server management (including MySQL databases). Logging into cPanel, I was glad to see a “Databases” section that included tools like the MySQL Database Wizard and phpMyAdmin.

In the next part of this blog, I plan to write about creating a MySQL Database, adding users and creating database objects.

I used PowerShell!

Our production environment consists of SQL Server 2008 R2 with several databases across multiple SQL Server instances. We follow a somewhat old-school approach to deployment: once a project is past QA and in the Stage/UAT environment, we no longer create and deploy builds in a cumulative fashion. When bugs are found in Stage/UAT, the builds that fix those bugs (an iterative cycle) are preserved and deployed sequentially, as-is, to Production as well. If we needed 10 iterations (hence 10 builds) to fix a bug in Stage/UAT, we will deploy the same 10 builds to Production sequentially!

The Problem:

This tediously meticulous approach guarantees (in theory) the repetition in Production of the same successful deployment path that was taken in the Stage/UAT environment. The same quality of code is deployed to Production as was deployed to Stage/UAT, and it is therefore expected to produce the same results. However, the number of iterations needed to fix all bugs in Stage/UAT is often large enough that we routinely end up with builds running into double digits. Efficiently and accurately deploying 10-plus builds to Production within a relatively short deployment window was starting to become a challenge for us (our DBA is expected not only to log deployment results, but to proceed to the next script ONLY upon success of the previous script). While we were not ready to fully automate the execution of our deployment scripts via a batch run, we needed a command-line method for deploying our SQL scripts relatively fast, where the execution messages are not only captured in a log file but also displayed on the screen. This would let our DBA see whether a script's execution encountered any errors without having to open the log file, and also help execute the deployment faster than a fully manual, SSMS-based approach.
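The deployment rule described above – run the builds in order, log each result, and proceed to the next one only if the previous one succeeded – can be sketched as a small driver. This is an illustration of the process, not our actual tooling; the build names and the `execute` hook are hypothetical stand-ins for invoking sqlcmd on each script.

```python
def run_builds(builds, execute):
    """Run deployment builds in order, logging each result and stopping
    at the first failure -- mirroring the manual rule: proceed to the
    next script ONLY upon success of the previous one.

    `builds` is an ordered list of build names; `execute` is a function
    that runs one build and returns True on success."""
    log = []
    for name in builds:
        ok = execute(name)
        log.append((name, "SUCCESS" if ok else "FAILED"))
        if not ok:
            break  # never run later builds on top of a failed one
    return log

# Illustrative stand-in for actually running sqlcmd on each script:
demo_results = {"build_01.sql": True, "build_02.sql": False, "build_03.sql": True}
print(run_builds(list(demo_results), demo_results.get))
# [('build_01.sql', 'SUCCESS'), ('build_02.sql', 'FAILED')]
```

Note that build_03.sql is never attempted once build_02.sql fails, which is exactly the behavior we need from any command-line approach.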

The Solution:

Our first attempt was using SQLCMD to achieve a fair degree of automation and speed up the deployment by reducing manual work. I have a simple test script here with a few PRINT statements, one simple SELECT statement that executes successfully, and another simple SELECT statement that fails due to a non-existent table (to simulate a script failure scenario). Do take note that my script uses the SQLCMD command “:on error exit”, which causes the batch to stop execution upon encountering an error. I have named the script quite creatively as “test.sql”.

USE Demo;
GO
:on error exit
PRINT N'Deploying Demo Script...';
GO
SELECT COUNT(*) FROM [dbo].[demo_order];
GO
PRINT N'Running query against non-existing table...';
GO
SELECT COUNT(*) FROM [dbo].[does_not_exist];
GO
PRINT N'This PRINT should not run as previous query errors and batch should exit...';
GO

When run in SSMS, this script produces the following output and exits the batch upon encountering the first error, as expected:

Deploying Demo Script...
(1 row(s) affected)
Running query against non-existing table...
Msg 208, Level 16, State 1, Line 2
Invalid object name 'dbo.does_not_exist'.
** An error was encountered during execution of batch. Exiting.

The quickest way to automate the execution of my test script is to use SQLCMD via the command line. Note the “-b” option in my SQLCMD command string, which forces the termination of the batch upon encountering errors. This is functionally similar to using the “:on error exit” SQLCMD command within the script itself. Here is the simple command-line string:

sqlcmd -S WKS18176\SANIL_2012 -d Demo -b -i test.sql -o test.sql.log.txt

When this SQLCMD command string is executed at the command prompt, it creates the log file documenting the error message and the fact that the batch was terminated. However, note that the command prompt screen shows no indication of success or failure of the script.

[Screenshot: running sqlcmd at the command prompt]

Unless our DBA opens up the log file “test.sql.log.txt” for review, he cannot see the execution and error messages shown below. (I could use the “type” command on the next line here, but we prefer to have a single-line command.)

Changed database context to 'Demo'.
Deploying Demo Script...
-----------
 12
(1 rows affected)
Running query against non-existing table...
Msg 208, Level 16, State 1, Server WKS18176\SANIL_2012, Line 2
Invalid object name 'dbo.does_not_exist'.

This is where PowerShell came to our rescue. With a minor modification to the SQLCMD command itself – piping its output to a PowerShell cmdlet – we were able to not only log the execution messages to a file, but also display them on the PowerShell screen, without losing any functionality related to exiting the batch upon error.

sqlcmd -S WKS18176\SANIL_2012 -d Demo -b -i test.sql | Tee-Object -file test.sql.log.txt

Here is a screenshot of executing my test script via PowerShell.

[Screenshot: executing the test script via PowerShell]

This was my first time using PowerShell, and I am impressed by how quickly we were able to learn and use it. Over the next few weeks, I am going to keep exploring PowerShell and learn how I can apply it to ease some more of our automation pain points!
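For readers unfamiliar with Tee-Object, the behavior we relied on is simple: every line of output is written to the screen and to a log file at the same time. A minimal sketch of that "tee" idea in Python (the function and file name are just illustrative):

```python
import sys

def tee(lines, log_path, screen=sys.stdout):
    """Write each line to both the screen and a log file, like
    PowerShell's Tee-Object: the DBA sees errors immediately,
    and the log file still captures everything for the record."""
    with open(log_path, "w") as log:
        for line in lines:
            screen.write(line + "\n")
            log.write(line + "\n")

tee(["Deploying Demo Script...",
     "Invalid object name 'dbo.does_not_exist'."],
    "test.sql.log.txt")
```

In our case the lines come from sqlcmd's standard output, and the pipe to Tee-Object does this duplication for us with no extra code.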

Studying for MCSA: SQL Server 2012 – A 90 Day Journey



With only 9 weeks left until the end of the year, I figured I should start getting serious about meeting some of my goals for 2013. I did successfully complete my first goal for 2013 – speaking at the PASS Summit last week. My second goal for the year is to achieve the MCSA: SQL Server 2012 certification. Since I am Microsoft Certified (MCTS) on SQL Server 2008, I will be taking the two transition exams for MCSA: SQL Server 2012 – the 70-457 & the 70-458 – instead of the usual 3 exams. However, in the absence of any training materials specifically geared towards these transition exams, I am using the standard training kit from Microsoft Press to augment my real-world experience on SQL Server 2012. The Microsoft Press training kit consists of 3 books, one each for exams 70-461, 70-462 & 70-463. These training kits are available on Amazon. I also happen to have a subscription to Pluralsight's training library, which has over 8 hours of training videos for exam 70-461 by Christopher Harrison and over 16 hours of training videos for exam 70-462 by Sean McCown.

I am hoping that the combination of these two learning resources will help me prepare for the MCSA: SQL Server 2012 exams in about 90 days (just over 12 weeks). My training schedule is inspired by Microsoft's “90 Days to MCSA: SQL Server 2012 Edition” program. I intend to continue writing a weekly blog post that documents my learning and progress towards achieving the MCSA: SQL Server 2012 certification. Hope to hear a lot of feedback and support from #sqlfamily on this journey!

[Photo: the Microsoft Press training kit books]

I am speaking at the PASS Summit 2013!


I am honored and excited to be selected to speak at the PASS Summit 2013 in Charlotte, NC – Oct 15th through 18th! I will be talking about “Database Unit Testing” with Visual Studio. This session highlights the importance of unit testing in the development life cycle of a database application. Unit testing a database application is definitely a lot more challenging than unit testing a VB.NET or C# application. Creating a consistent database test environment involves not only database code, but also the data itself. More often than not, due to the time and effort involved in creating a consistent database test environment, unit testing database code is rarely given a thought upfront during development. This usually leads to late discovery of bugs, which get more expensive to fix as the development life cycle progresses. Visual Studio, with Database Projects and more recently with SQL Server Data Tools (SSDT), has made unit testing fairly easy to implement. During the course of this session, we will review the concepts of unit testing and demonstrate the implementation of unit tests for a Database project and an SSDT project, in VSTS 2010 and VSTS 2012 respectively. If you have already implemented database unit test projects in VSTS 2010, we will also go through a demo of upgrading them to SSDT.
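Whatever the tooling, a database unit test follows the same arrange/act/assert pattern: set up a known data state, run the code under test, and assert on the result. The session demonstrates this with Visual Studio test projects against SQL Server; the sketch below shows the bare pattern using Python's unittest with an in-memory sqlite3 database as a self-contained stand-in (the table and query are hypothetical).

```python
import sqlite3
import unittest

class OrderCountTest(unittest.TestCase):
    """Sketch of the database unit-test pattern: arrange a known
    data state, act by running the query under test, assert on
    the result. sqlite3 stands in for SQL Server here."""

    def setUp(self):
        # Arrange: a fresh in-memory database with known rows,
        # so every test starts from the same consistent state.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE demo_order (id INTEGER)")
        self.conn.executemany("INSERT INTO demo_order VALUES (?)",
                              [(1,), (2,), (3,)])

    def test_order_count(self):
        # Act + Assert: the query under test returns the expected count.
        count = self.conn.execute(
            "SELECT COUNT(*) FROM demo_order").fetchone()[0]
        self.assertEqual(count, 3)

    def tearDown(self):
        self.conn.close()

suite = unittest.defaultTestLoader.loadTestsFromTestCase(OrderCountTest)
unittest.TextTestRunner(verbosity=0).run(suite)
```

The hard part in real projects is the setUp step – getting the database into a consistent, known state every run – which is exactly what the Visual Studio tooling helps automate.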

I have presented this session at several SQL Saturday events, user group meetings, and regional conferences, and I am looking forward to bringing it to the PASS Summit. I look forward to seeing you all at the Summit in October!

SQL Saturday # 236, St.Louis – A Review

[Photo: the St.Louis SQL Saturday #236 organizing team]

We had a great SQL Saturday #236, the second annual St.Louis SQL Saturday event, on Aug 3. As always, PASS played a big role in making the event successful by providing the necessary infrastructure to run it.

We moved the 2013 event to a different facility this year, the Wool Center at SLU. SLU provided the venue as well as a few of their staff members to help us out on the day of the event. We could not have asked for any better. We definitely plan to continue hosting future St.Louis SQL Saturdays at SLU.

I would also like to thank the core team of organizers – Mike Lynn, Jay Carter, Danielle Mayers, and Kim Tessereau – for putting in a lot of hard work to make this event possible. There were also several volunteers who helped out at the registration desk, the lunch line, and the classrooms, all of whom deserve a big thanks. Organizing a successful SQL Saturday is definitely a team effort, and I could not have asked for a better team for this event. No SQL Saturday event is possible without the speakers who contribute their time and skills to present at the event. The generous support of all event sponsors plays an equally important role.

Last but not least, all the attendees who took the time to attend this event on a Saturday, and who are passionate about learning as well as the SQL community, deserve a big round of applause as well.

As organizers of the event, we noted a few improvements that can help us make the 2014 event even better:

1. Event date – Quite a few of our regular local speakers, as well as several potential attendees, could not make it to the event due to vacation plans. Several SQL Saturday organizers from the mid-west region had similar experiences in the months of July and August. We are planning for an event date in September for the 2014 St.Louis SQL Saturday.

2. Communication of event start time and SpeedPASS – Though the first lecture of the day started at 9:30 AM and the registration desk opened at 8:30 AM, we had several attendees show up before 8 AM. Some of the sponsor representatives did not get directions to the free parking lot. We will definitely be much clearer with our communication in the future. On the bright side, over 60% of the attendees came in with a printed SpeedPASS, which helped the registration process move smoothly.

3. Lunch – We seem to have erred on the side of caution again while ordering lunch for the attendees, volunteers, and sponsors. While we donated the leftover lunch boxes to the building staff, we intend to plan the lunch orders better for the 2014 event.

4. After party – We intend to explore a venue closer to the SLU campus for the 2014 event's after party.

5. Recommended hotel – While we were unable to secure a discount at the nearby hotels for the 2013 event, we intend to start negotiating with these hotels earlier for the 2014 event.

Please follow these links to view the pictures taken during the event:

Please feel free to send us your feedback and suggestions to make the St.Louis SQL Saturday event better !

SQL Saturday #236, St.Louis, MO – Aug 3rd 2013

The second annual St.Louis SQL Saturday is coming up in less than 2 weeks, on Aug 3rd 2013, at the Wood Center on the SLU campus (3545 Lindell Blvd, Center for Workforce & Organizational Development – 2nd floor, St. Louis, MO 63103). The event is a full day of free SQL Server training, consisting of 20 sessions on topics like Database Administration, Business Intelligence, Application Development, and Professional Development. Free parking for the event is available in the SLU parking lot across Olive St (Theresa Lot). No parking passes are needed for event attendees to park in this lot. Street parking is at your own risk – the City of St.Louis metered parking spots are no longer free on weekends. From the parking lot, follow the yard signs for SQL Saturday. Please follow this link for the floor plan of the venue. This will help you familiarize yourself with where the registration desk, classrooms, and facilities are. Please note that our event is on the 2nd floor of the building. Nearby hotels include:

  • Hotel Ignacio – 0.2 miles
  • Courtyard St.Louis Downtown (Marriott) – 1 mile
  • Pear Tree Inn Union Station (Drury) – 1.1 mile

On the day of the event, the registration desk will open at 8:30 AM, and the first lecture of the day starts at 9:30 AM. A few days before the event, all registered attendees will receive an email with a link to their SpeedPASS. The first 50 attendees who sign in at the registration desk with a SpeedPASS will get a free SQL Saturday t-shirt. If any of your friends or colleagues are interested in attending, please encourage them to register as soon as possible (on-the-spot registrations are accepted, but they lead to long lines and waiting times). Box lunches will be provided at the event for a nominal fee of $10, only to those attendees who pay the lunch fee in advance, before 29th July 2013 (an email with a payment link for the lunch fee will be coming out shortly). We give our caterers the head count on the Tuesday before the event. During lunch, several of our Gold-level sponsors will be talking about their products and services in various classrooms. The “Women In Technology” panel talk will also be held during the lunch break, in one of the classrooms. Please visit the event Schedule page for details.

SQL Saturday events are made possible thanks to the contributions of speakers & volunteers, and the generous support of the vendors. Please do thank the speakers and volunteers for all the work they put in to host these events. Several sponsors set up booths at the event and offer raffle prizes to the attendees. Do stop by their booths – they are always excited to talk to you about their products and services. If you would like a chance to win one of their raffle prizes, please drop your raffle tickets at their booths. All raffle prizes will be drawn at the end of the day. You must be present to win, and each winner can win only one raffle prize. Pluralsight has offered a raffle prize of a free one-year subscription to their entire training library (worth $299), and the attendee tickets you drop at the registration desk will qualify you to participate in the raffle for this prize. There will be several vendor gifts as well as books to raffle away. xSQL Software is offering all attendees of this event a free license of their “xSQL Data Compare for SQL Server” ($349 value). Please follow this link for details. This offer is valid ONLY on Aug 3rd and 4th, 2013. We have planned an informal gathering after the event (the after party) at Schlafly Bottleworks. Please note that the event organizers are only suggesting a venue for the attendees to get together. All individuals are responsible for finding their own tables and paying for their own food and beverages. Please see this link for after party details.

As usual, we request all attendees to be respectful of the venue and its property. SLU has generously offered the use of their facilities for this event, and we definitely want them to continue their support of our event for many years to come.

If you need any more reasons to convince your friends or co-workers to attend the St.Louis SQL Saturday, please read Kathi's blog on the top 10 reasons to attend a SQL Saturday. Hope to see you all on Aug 3rd 2013 for an awesome STL SQL Saturday!