Tuesday, August 23, 2016

Real World SQL and PL/SQL: Advice from the Experts



Because my hero is Cary Millsap, I'm going to do what he did and publish my foreword (well, Preface). All joking aside, I consider myself incredibly fortunate to have been included in this project. I learned...a lot, simply by trying to find the authors' mistakes (and there were not many). There was a lot more work than I expected, as well. (Technical) Editing is a lot easier than writing, to be sure.

Brendan Tierney and Heli Helskyaho approached me in March 2015 about being an author on this book, along with Arup Nanda and Alex Nuijten. Soon after, we picked up Martin Widlake. To say that I was honored to be asked would be a gross understatement. Rather quickly though, I realized that I did not have the mental energy to devote to the project and didn’t want to put the other authors at risk. Still wanting to be part of the book, I suggested that I be the Technical Editor and they graciously accepted my new role.

This is my first official role as Technical Editor, but I've been doing it for years through work: checking my work, checking others' work, etc. Having a touch of Obsessive Compulsive Disorder (OCD) helps greatly.

All testing was done with the pre-built Database App Development VM provided by OTN/Oracle, which made things easy. Configuration for testing was simple with the instructions provided in those chapters that required it.

One of my biggest challenges was the multi-tenant architecture of Oracle 12c. I haven’t done DBA type work in a few years, so trying to figure out if I should be doing something in the root container (CDB) or the pluggable database (PDB) was fun. Other than that though, the instructions provided by the authors were pretty easy to follow.
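
For anyone else fumbling around in the multitenant world, a couple of quick checks from SQL*Plus made it obvious which container I was in. This is just a sketch; the PDB name below is an assumption, so check V$PDBS for whatever your VM actually calls it:

show con_name
-- switch from the root (CDB$ROOT) into the pluggable database
alter session set container = orclpdb;  -- PDB name is an assumption
show con_name
select name, open_mode from v$pdbs;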

Design (data modeling, Edition-Based Redefinition, VPD), Security (redaction/masking, encryption/hashing), Coding (regular expressions, PL/SQL, SQL), Instrumentation, and "Reporting," or turning that raw data into actionable information (Data Mining, Oracle R, Predictive Queries): these topics are covered in detail throughout this book. It's everything a developer would need to build an application from scratch.

Probably my favorite part of this endeavor is that I was forced to do more than simply see if it works. Typically when reading a book, or a blog entry, I'll grab the technical solution and move on, often skipping the Why, When, and Where. How, to me, is relatively easy. I read AskTom daily for many years; it was my way of taking a break without getting in trouble. At first, it was to see how particular problems were solved, occasionally using it for my own problems. After a year or two, I wanted to understand the Why of doing it a certain way and would look for those responses where Tom provided insight into his approach.

That's what I got reviewing this book. I was allowed into their minds, to see not only How they solved technical problems, but Why. This is invaluable for developers and DBAs. Most of us can figure out How to solve specific technical issues, but to reach that next level we need to understand the Why, When and Where. This book provides that.

Thursday, July 9, 2015

Kscope15 - It's a Wrap, Part II

Another fantastic Kscope in the can.

This was my final year in an official capacity which was a lot more difficult to deal with than I had anticipated. Here's my record of service:
  • 2010 (2011, Long Beach) - I was on the database abstract review committee run by Lewis Cunningham. I ended up volunteering to help put together the Sunday Symposium and, with the help of Dominic Delmolino, Cary Millsap and Kris Rice, I felt I did a pretty decent job.
  • 2011 (2012, San Antonio) - Database track lead. I believe this is the year that Oracle started running the Sunday Symposiums. Kris again led the charge, with some input from those other two from the year before, i.e. DevOps oriented.
  • 2012 (2013, New Orleans) - Content co-chair for the traditional stuff (Database, APEX, ADF), Interview Monkey (Tom Kyte OMFG!), OOW/ODTUG Coordinator, etc.
  • 2013 (2014, Seattle) - Content co-chair for the traditional stuff (Database, APEX, ADF), Interview Monkey, OOW/ODTUG Coordinator, etc.
  • 2014 (2015, Hollywood, FL) - Content co-chair for the traditional stuff (Database, APEX, ADF)

This has been a wonderful time for me, both professionally and, more importantly to me, personally. Obviously I had a big voice in the direction of content. Also, and maybe hard to believe, I actually presented for the first time. Slotted against Mr. Kyte. I reminded everyone of that too. Multiple times. It seemed to go well though. Only a few made fun of me.

I was constantly recruiting too. "Did you submit an abstract?" "No." "Why not?" And then I'd go into my own personal diatribe (ignoring my own lack of presenting) about why they should present. Sarah Craynon Zumbrum summed it up pretty well in a recent article.

But it was the connections I made, the people I met, the stories I shared (#ampm, #cupcakeshirt, etc.), and the friends that I made; that's what has had the most impact on me. Kscope is unique in that way because of its size...at Collaborate or OOW, you'll be lucky to see someone more than once or twice; at Kscope you're running into everyone constantly.

How could I forget? #tadasforkate! This year was even more special. For those that don't know, Katezilla is my profoundly delayed but equally profoundly happy 10 y/o daughter. Just prior to the conference, her physical therapist taught her "tada!" and Kate would hold her hands up high in the air and everyone around would yell, "Tada!" I got this crazy idea to ask others to do it and I would film it. Thirty or forty videos and hundreds of participants later...



So a gigantic thank you to everyone who made this possible for me.
Here's a short list of those that had a direct impact on me...
  • Lewis Cunningham - he asked me to be a reviewer, which started all of this off.
  • Mike Riley - can't really say enough about Mike. After turning me away a long time ago (jerk), he was probably my biggest supporter over the years. (Remind me next year to tell you about "The Hug.") Mike, and his family, are very dear to me.
  • Monty Latiolais (rhymes with Frito Lay I would tell myself) - How can you not love this guy?
  • Natalie Delemar - Co-chair for EPM/BI and then boss as Conference Chair.
  • Opal Alapat - Co-chair for EPM/BI and one of my favorite humans ever invented. I aspire to be more organized, assertive, and bad-ass like Opal.
That list is by no means exhaustive. It doesn't even include staff at YCC, like Crystal Walton, Lauren Prezby and everyone else there. Nor does it include the very long list of Very Special People I've met. I consider myself very fortunate and incredibly grateful.

What's the future hold?
I have no idea. My people are in talks with Helen J. Sander's people to do one or more presentations next year, so there's that. Speaking of which...it's in Chicago. Abstract submissions start soon; I hope you plan on submitting. If you're not ready to submit, I hope you try to take part in shaping the content by joining one of the roughly 10 abstract review committees. Who knows where they may lead you?

Finally, here's the It's a Wrap video from Kscope15 (see Helen's story there). Here's Kscope16's site. Go sign up.

Wednesday, April 22, 2015

Is MERGE a bug?

A few years back I pondered whether DISTINCT was a bug.

My premise was that if you are depending on DISTINCT to return a correct result set, something is seriously wrong with your table design. I was reminded of this again recently when I ran across Kent Graziano's post, Better Data Modeling: Are you making these 3 beginner mistakes in your data models? Specifically:
Instead of that, you should be defining a natural, or business, key for every table in your system. A natural key is an attribute or set of attributes (that occur naturally in the data set) required to uniquely identify a row in that table. In addition you should define a Unique Key Constraint on those attributes in the database. Then you can be sure you will not get any duplicate data into the tables.

CLARIFICATION: This point has caused a lot of questions and comments. To be clear, the mistake here is to have ONLY defined a surrogate key. I believe that even if using surrogate keys is the best solution for your design, you should ALSO define an alternate unique natural key.
So why MERGE?

I learned about the MERGE statement in 2008. During an interview, Frank Davis asked me about when I would use it. I didn't even know what it was (and admitted that) but I went home that night and...wait...I think he asked me about multi table inserts. Whatever, credit is still going to Mr. Davis. Where was I? OK, so I had been working with Oracle for about 6 years at that point and I didn't know about it. My initial reaction was to use it everywhere (not really)! You know, shiny object and all. Look! Squirrel!

Why am I considering MERGE a bug? Let me be more specific. I was working with a couple of tables and had not written the API for them yet, and a developer was writing some PL/SQL to update the records from APEX. In his loop he had a MERGE. I realized at that moment there was (1) no surrogate key and (2) no natural key defined (which ties in with Kent's comments up above). Upon realizing the developer was doing this, I knew immediately what the problem was (besides not using a PL/SQL API to nicely encapsulate the business logic). The table was poorly designed.

Easy fix: update the table with a surrogate key and define a natural key. I was thankful for the reminder; I hadn't added the unique constraint yet. Of course, had I written the API already, I probably would have noticed the design error. Either way, a win for design.
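
Something along these lines (the table and column names are made up for illustration; only the pattern matters):

-- 12c identity column as the surrogate key
alter table widget_orders add ( widget_order_id number generated always as identity );
alter table widget_orders add constraint widget_orders_pk primary key ( widget_order_id );
-- unique constraint on the natural/business key
alter table widget_orders add constraint widget_orders_uk unique ( order_source, order_number );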

Now, there are perfectly good occasions to use the MERGE statement. Most of those, to me anyway, relate to legacy systems where you don't have the ability to change the underlying table structures (or it's just cost prohibitive), or to ETL, where you want to load/update a dimension table in your data warehouse.
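
For the ETL case, here's a minimal sketch (the dimension and staging table names are invented, not from any real system):

merge into dim_customer d
using stg_customer s
   on ( d.customer_nk = s.customer_nk )  -- match on the natural/business key
 when matched then
   update set d.customer_name = s.customer_name,
              d.region        = s.region
 when not matched then
   insert ( d.customer_sk, d.customer_nk, d.customer_name, d.region )
   values ( dim_customer_seq.nextval, s.customer_nk, s.customer_name, s.region );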

Noons, how's that? First time out in 10 months. Thanks for the push.

Wednesday, June 4, 2014

Fun with SQL - Silver Pockets Full

Silver Pockets Full, send this message to your friends and in four days the money will surprise you. If you don't, well, a pox on your house. Or something like that. I didn't know what it was, I just saw this in my FB feed:



Back in November, I checked to see the frequency of having incremental numbers in the date, like 11/12/13 (my birthday) and 12/13/14 (Kate's birthday). I don't want to hear how the rest of the world does their dates either; I know (I now write my dates like YYYY/MM/DD on everything, just so you know, that way I can sort it...or something).
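
If you're curious, here's a rough sketch of how you could check for those incremental dates (a reconstruction, not the query I ran back then):

select to_char( d, 'mm/dd/yy' ) pretty_date
from
  (
    select date '2000-01-01' + level - 1 d
    from dual
      connect by level <= 36525
  )
-- day is one more than the month, two-digit year is two more, e.g. 11/12/13
where to_number( to_char( d, 'dd' ) ) = to_number( to_char( d, 'mm' ) ) + 1
  and to_number( to_char( d, 'yy' ) ) = to_number( to_char( d, 'mm' ) ) + 2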

Anyway, SQL to test out the claim of once every 823 years. Yay SQL.

OK, I'm not going to go into the steps necessary because I'm lazy (and I'm just lucky to be writing here), so here it is:
-- months (over the next 50,000 days) with five Fridays, Saturdays and Sundays
-- note: the 'fmDay' comparison assumes an English NLS_DATE_LANGUAGE
select *
from
  (
    select 
      to_char( d, 'yyyymm' ) year_month,
      count( case
               when to_char( d, 'fmDay' ) = 'Saturday' then 1
               else null
             end ) sats,
      count( case
               when to_char( d, 'fmDay' ) = 'Sunday' then 1
               else null
             end ) suns,
      count( case
               when to_char( d, 'fmDay' ) = 'Friday' then 1
               else null
             end ) fris
    from
      (
        -- one row per day, starting 2014-01-01
        select to_date( '20131231', 'yyyymmdd' ) + rownum d
        from dual
          connect by level <= 50000
      )
    group by 
      to_char( d, 'yyyymm' )
  )
where fris = 5
  and sats = 5
  and suns = 5
So over the next 50,000 days (roughly 137 years), this happens 138 times, or about once a year. I'm fairly certain that doesn't rise to the once-every-823-years claim. But it's cool, maybe.
YEAR_MONTH       SATS       SUNS       FRIS
---------- ---------- ---------- ----------
201408              5          5          5 
201505              5          5          5 
201601              5          5          5 
201607              5          5          5 
201712              5          5          5 
128 more occurrences...
214607              5          5          5 
214712              5          5          5 
214803              5          5          5 
214908              5          5          5 
215005              5          5          5 

 138 rows selected 
I'm not the only dork that does this either; here's one in Perl. I'm sure there are others, but again, I'm lazy.