Ankit Bansal
  • Videos: 192
  • Views: 4,485,594
Acies Global SQL Interview Question | Employee Swipe in and Swipe out
In this video we will solve a tricky SQL interview question asked in an Acies Global interview.
You can now practice high-quality SQL questions on Namaste SQL: www.namastesql.com/coding-problems
Script:
CREATE TABLE swipe (
employee_id INT,
activity_type VARCHAR(10),
activity_time datetime
);
-- Insert sample data
INSERT INTO swipe (employee_id, activity_type, activity_time) VALUES
(1, 'login', '2024-07-23 08:00:00'),
(1, 'logout', '2024-07-23 12:00:00'),
(1, 'login', '2024-07-23 13:00:00'),
(1, 'logout', '2024-07-23 17:00:00'),
(2, 'login', '2024-07-23 09:00:00'),
(2, 'logout', '2024-07-23 11:00:00'),
(2, 'login', '2024-07-23 12:00:00'),
(2, 'logout', '2024-07-23 15:00:00'),
(1, 'login', '2024-07-24 08:...
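The usual shape of the solution is to pair each login with the next swipe via LEAD, then aggregate per employee per day: total hours = last swipe minus first swipe, productive hours = only the login-to-next-swipe gaps. A minimal runnable sketch of that idea (not necessarily the video's exact solution), using Python's sqlite3 with only the complete sample rows above; window functions require SQLite 3.25+:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE swipe (employee_id INT, activity_type VARCHAR(10), activity_time DATETIME);
INSERT INTO swipe VALUES
(1, 'login',  '2024-07-23 08:00:00'),
(1, 'logout', '2024-07-23 12:00:00'),
(1, 'login',  '2024-07-23 13:00:00'),
(1, 'logout', '2024-07-23 17:00:00'),
(2, 'login',  '2024-07-23 09:00:00'),
(2, 'logout', '2024-07-23 11:00:00'),
(2, 'login',  '2024-07-23 12:00:00'),
(2, 'logout', '2024-07-23 15:00:00');
""")

rows = conn.execute("""
WITH cte AS (
  SELECT employee_id,
         DATE(activity_time) AS work_day,
         activity_type,
         activity_time,
         -- next swipe of the same employee on the same day
         LEAD(activity_time) OVER (
           PARTITION BY employee_id, DATE(activity_time)
           ORDER BY activity_time
         ) AS next_time
  FROM swipe
)
SELECT employee_id,
       work_day,
       -- total hours: last swipe minus first swipe of the day
       (MAX(strftime('%s', activity_time)) - MIN(strftime('%s', activity_time))) / 3600.0 AS total_hours,
       -- productive hours: sum only the login -> next-swipe intervals
       SUM(CASE WHEN activity_type = 'login'
                THEN (strftime('%s', next_time) - strftime('%s', activity_time)) / 3600.0
                ELSE 0 END) AS productive_hours
FROM cte
GROUP BY employee_id, work_day
ORDER BY employee_id, work_day
""").fetchall()
print(rows)  # -> [(1, '2024-07-23', 9.0, 8.0), (2, '2024-07-23', 6.0, 5.0)]
```

The same query translates almost directly to SQL Server or MySQL by swapping the `strftime`-based arithmetic for `DATEDIFF`/`TIMESTAMPDIFF`.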
Views: 2,911

Videos

American Express SQL Interview Question and Solution | Page Recommendation
7K views · 1 day ago
In this video we will discuss a SQL interview question asked in American Express Data Analyst Interview. High Quality Analytics Courses : www.namastesql.com/ script: CREATE TABLE friends ( user_id INT, friend_id INT ); Insert data into friends table INSERT INTO friends VALUES (1, 2), (1, 3), (1, 4), (2, 1), (3, 1), (3, 4), (4, 1), (4, 3); Create likes table CREATE TABLE likes ( user_id INT, pag...
SQL Interview Question Asked in Tredence Analytics
10K views · 14 days ago
In this video we will solve a SQL interview question asked in Tredence Analytics . We will solve it using 3 methods. High quality in depth Analytics courses : www.namastesql.com/ script: CREATE TABLE cinema ( seat_id INT PRIMARY KEY, free int ); delete from cinema; INSERT INTO cinema (seat_id, free) VALUES (1, 1); INSERT INTO cinema (seat_id, free) VALUES (2, 0); INSERT INTO cinema (seat_id, fr...
SQL Test Based on Real Interview | SQL Interview Questions and Answers
21K views · 21 days ago
In this video we will solve a complete SQL test consisting of 7 interview questions. This would be a great exercise to practice SQL. Start your data analytics journey: www.namastesql.com/ script: CREATE TABLE users ( USER_ID INT PRIMARY KEY, USER_NAME VARCHAR(20) NOT NULL, USER_STATUS VARCHAR(20) NOT NULL ); CREATE TABLE logins ( USER_ID INT, LOGIN_TIMESTAMP DATETIME NOT NULL, SESSION_ID INT PRIMA...
Difference Between SQL Views vs Materialized Views | Frequently Asked SQL Interview Question
9K views · 1 month ago
In this video we will understand SQL views and the difference between SQL views and materialized views. This is a very frequently asked SQL interview question, and the only video you need to answer it in your next SQL interview. script: CREATE TABLE orders ( order_id INT, order_date DATE, product_name VARCHAR(20), sales INT ); INSERT INTO orders (order_id, order_date, product_name, sa...
Fractal Analytics SQL Interview Question (Game of Thrones Database) | SQL for Data Engineer
18K views · 1 month ago
In this video we will discuss an advanced interview question on SQL asked in a Data Engineer interview based on Game of Thrones Database. Kickoff your data analytics journey : www.namastesql.com/ Script: Create the 'king' table CREATE TABLE king ( k_no INT PRIMARY KEY, king VARCHAR(50), house VARCHAR(50) ); Create the 'battle' table CREATE TABLE battle ( battle_number INT PRIMARY KEY, name VARC...
Accenture SQL Interview Question | Database Case Sensitivity vs Insensitivity
20K views · 1 month ago
In this video we will discuss an Accenture SQL interview question and solve it both when the data is case sensitive and when it is case insensitive. Here is the script: CREATE TABLE employees (employee_id int,employee_name varchar(15), email_id varchar(15) ); delete from employees; INSERT INTO employees (employee_id,employee_name, email_id) VALUES ('101','Liam Alton', 'li.al@abc.com'); INSERT INTO empl...
BCA to Data Analyst at Big Tech Product Based Company Ex-Amazon, PhonePe, Noon
4.3K views · 1 month ago
In this podcast with Vishal Sahu, we discuss how he moved from a BCA to data analytics at a ride-sharing company and worked with big companies like Amazon, Noon, PhonePe, etc. We discussed the following points: 1- Salaries in the analytics domain 2- What skills are needed to move into data analytics 3- What triggered him to move to data analytics from L2 at Amazon. 4- How it feels to work in ...
15 Days of Learning SQL | Advanced SQL for Data Analytics
20K views · 2 months ago
In this video we will deep dive into a problem called "15 days of learning SQL" from HackerRank. This is a challenging SQL problem to solve. Here is the script: CREATE TABLE Submissions ( submission_date DATE, submission_id INT PRIMARY KEY, hacker_id INT, score INT ); INSERT INTO Submissions (submission_date, submission_id, hacker_id, score) VALUES ('2016-03-01', 8494, 20...
Netflix Data Cleaning and Analysis Project | End to End Data Engineering Project (SQL + Python)
31K views · 2 months ago
In this video we will implement an end-to-end ELT project. ELT stands for Extract, Load and Transform. We will use the Netflix dataset to clean and analyze the data using SQL and Python. LinkedIn: www.linkedin.com/in/ankitbansal6/ High quality, affordable Data Analytics courses: www.namastesql.com/ End to End ETL project: ruclips.net/video/uL0-6kfiH3g/видео.html Netflix dataset: www.kaggle.com/dat...
Meesho SQL Interview Question for Data Analysts | SQL Non-Equi Join | Aam vs Mentos Zindagi
14K views · 2 months ago
Meesho SQL Interview Question for Data Analysts | SQL Non-Equi Join | Aam vs Mentos Zindagi
3 Solutions to a ITC Infotech SQL Interview Question
10K views · 2 months ago
In this video we will solve an ITC Infotech SQL interview question using 3 solutions. Here is the script: CREATE TABLE city_distance ( distance INT, source VARCHAR(512), destination VARCHAR(512) ); delete from city_distance; INSERT INTO city_distance(distance, source, destination) VALUES ('100', 'New Delhi', 'Panipat'); INSERT INTO city_distance(distance, source, destination) VALUES ('200', 'Amb...
Angel One Easy-Peasy SQL Interview Question for a Data Science Position
11K views · 2 months ago
In this video we will discuss an Angel One SQL interview question asked for a data science position. Kickoff your data analytics journey: www.namastesql.com/ Script: CREATE TABLE tickets ( airline_number VARCHAR(10), origin VARCHAR(3), destination VARCHAR(3), oneway_round CHAR(1), ticket_count INT ); INSERT INTO tickets (airline_number, origin, destination, oneway_round, ticket_count) VALUES ('D...
Swiggy Data Analyst SQL Interview Question and Answer
14K views · 2 months ago
In this video we will discuss an SQL interview question asked at Swiggy for a Data Analyst position. Here is the script: Create the table CREATE TABLE stock ( supplier_id INT, product_id INT, stock_quantity INT, record_date DATE ); Insert the data delete from stock; INSERT INTO stock (supplier_id, product_id, stock_quantity, record_date) VALUES (1, 1, 60, '2022-01-01'), (1, 1, 40, '2022-01-02'),...
IBM Data Engineer SQL Interview Question (Hacker Rank Online Test)
19K views · 2 months ago
In this video we are going to discuss a SQL interview question asked in IBM for a data engineer position. It was part of a HackerRank test. We are also going to tweak the question a bit and try to solve it. Kick off your data analytics journey: www.namastesql.com/ script: CREATE TABLE FAMILIES ( ID VARCHAR(50), NAME VARCHAR(50), FAMILY_SIZE INT ); Insert data into FAMILIES table INSERT INTO FA...
Honeywell SQL Interview Question | Print Movie Stars (⭐ ⭐ ⭐ ⭐⭐) For best movie in each Genre
12K views · 3 months ago
Honeywell SQL Interview Question | Print Movie Stars (⭐ ⭐ ⭐ ⭐⭐) For best movie in each Genre
Cracked Myntra as Data Analyst with 1 Year Experience
16K views · 3 months ago
Cracked Myntra as Data Analyst with 1 Year Experience
End to End Data Analytics Project (Python + SQL)
113K views · 3 months ago
End to End Data Analytics Project (Python + SQL)
SCD Type 1 and Type 2 using SQL | Implementation of Slowly Changing Dimensions
11K views · 3 months ago
SCD Type 1 and Type 2 using SQL | Implementation of Slowly Changing Dimensions
SQL Merge Statement Tutorial A-Z | How to use Merge in SQL step by step
11K views · 3 months ago
SQL Merge Statement Tutorial A-Z | How to use Merge in SQL step by step
SQL QUALIFY Keyword | Reduce Your Sub Queries and CTEs
9K views · 3 months ago
SQL QUALIFY Keyword | Reduce Your Sub Queries and CTEs
Top 5 Advanced SQL Interview Questions and Answers | Frequently Asked SQL interview questions
41K views · 3 months ago
Top 5 Advanced SQL Interview Questions and Answers | Frequently Asked SQL interview questions
Tricky SQL Interview Question by a Product Based Company | Ludo King SQL Analytics
8K views · 4 months ago
Tricky SQL Interview Question by a Product Based Company | Ludo King SQL Analytics
Famous SQL Interview Question | First Name , Middle Name and Last Name of a Customer
25K views · 4 months ago
Famous SQL Interview Question | First Name , Middle Name and Last Name of a Customer
Rollup, Cube and Grouping Sets in SQL | Advanced SQL Tutorial | Beyond Group By
16K views · 4 months ago
Rollup, Cube and Grouping Sets in SQL | Advanced SQL Tutorial | Beyond Group By
SQL For Business Intelligence | YoY, QoQ, MTD, YTD, QTD etc. in a Single SQL
16K views · 4 months ago
SQL For Business Intelligence | YoY, QoQ, MTD, YTD, QTD etc. in a Single SQL
Building Calendar Dimension Table from Scratch with SQL | SQL For Analytics
16K views · 4 months ago
Building Calendar Dimension Table from Scratch with SQL | SQL For Analytics
Launching Live Bootcamp of SQL for Data Analytics 0 to Hero | Starting March 2nd 2024
9K views · 4 months ago
Launching Live Bootcamp of SQL for Data Analytics 0 to Hero | Starting March 2nd 2024
Why I left Amazon.
13K views · 5 months ago
Why I left Amazon.
Superstore Data Analysis | End to End AWS Data Engineering Project for Beginners
24K views · 5 months ago
Superstore Data Analysis | End to End AWS Data Engineering Project for Beginners

Comments

  • @varunanr5433
    @varunanr5433 16 hours ago

    Here's my approach in MySQL:

    with cte as (
        select *, date(activity_time) as date,
               lead(activity_time) over (partition by employee_id, date(activity_time) order by activity_time) as nxt_time
        from swipe)
    select employee_id, date,
           SUM(TIME_TO_SEC(hrs)) / 3600 as Total_hrs,
           SUM(TIME_TO_SEC(Office_hrs)) / 3600 as Office_hrs
    from (
        select *, case when rn % 2 = 0 then 0 else hrs end as Office_hrs
        from (
            select employee_id, date(activity_time) as date,
                   timediff(nxt_time, activity_time) as hrs,
                   row_number() over (partition by employee_id, date(activity_time)) as rn
            from cte
            where nxt_time is not null) as A
        ) B
    group by employee_id, date

    @ankit Please check and correct me if I'm wrong.

  • @dilipinamdar5523
    @dilipinamdar5523 18 hours ago

    Thank you🙏

  • @addhyasumitra90
    @addhyasumitra90 18 hours ago

    With all_matches as ( select team, SUM(matches_played) as total_matches_played from (select team_1 as team, COUNT(*) as matches_played from icc_world_cup_1 group by team_1 UNION all select team_2 as team, COUNT(*) as matches_played from icc_world_cup_1 group by team_2)a group by team), Winner as( select winner, COUNT(*) as wins from icc_world_cup_1 group by winner) Select team, total_matches_played, COALESCE(wins,0) as wins_match, total_matches_played-COALESCE(wins,0) as matches_lost from all_matches as a left join winner as w on A.team=w.winner order by wins_match desc;

  • @vaibhavverma1340
    @vaibhavverma1340 19 hours ago

    Simple solution 😀:

    with cte as (
        select employee_id, activity_type,
               convert(date, activity_time) as dates,
               convert(time, activity_time) as times,
               case when activity_type = 'logout'
                    then lag(convert(time, activity_time), 1) over (partition by employee_id, convert(date, activity_time) order by employee_id)
                    else null end as prev_times
        from swipe)
    select employee_id, dates,
           (datepart(hh, max(times)) - datepart(hh, min(times))) as total_spent_hrs,
           sum((datepart(hh, times)) - datepart(hh, prev_times)) as productive_hrs
    from cte
    group by employee_id, dates

  • @mathman298
    @mathman298 19 hours ago

    --select * from swipe with cte as ( select * , activity_time::DATE as login_day, (activity_time_2 - activity_time) as time_inside from (select * , lead(activity_type , 1 ) over ( partition by employee_id order by activity_time ) as activity_type_2 , LEAD( activity_time ,1 ) over (partition by employee_id order by activity_time ) as activity_time_2 from swipe ) where activity_type = 'login' ) , info as ( select *, sum(time_inside) over(partition by employee_id,login_day order by login_day) as Time_spent ,count(1) over(partition by employee_id,login_day order by login_day) as swipes from cte ) select DISTINCT employee_id,login_day,time_spent from info select employee_id,login_day,max(swipes) from info group by employee_id ,login_day

  • @mathman298
    @mathman298 20 hours ago

    with cte as(select DISTINCT * from (select *, Lead(action ,1) over(partition by emp_id order by time) as second , Lead(time,1) over(partition by emp_id order by time) as out_time from hospital ) as a where action = 'in' ) select emp_id , action , time from cte where second is null

  • @mathman298
    @mathman298 20 hours ago

    with cte as(select DISTINCT * from (select *, Lead(action ,1) over(partition by emp_id order by time) as second , Lead(time,1) over(partition by emp_id order by time) as out_time from hospital ) as a where action = 'in' ) select count(emp_id) as 'total number of emp inside' from cte where second is null

  • @MuskanGoyal-db7cs
    @MuskanGoyal-db7cs 20 hours ago

    with cte as( select seat_id, free,lead(free,1)over(order by seat_id) as new from cinema) select seat_id ,free from cte where new=1 and free=1;

  • @anuragshrivastava-ez3mv
    @anuragshrivastava-ez3mv 21 hours ago

    Mine is 7.5 times

  • @addhyasumitra90
    @addhyasumitra90 21 hours ago

    Mind-blowing!! Initially, when I heard the problem statement, I didn't think I would be able to understand the solution itself, but the way you break it down is truly awesome :)

  • @hariikrishnan
    @hariikrishnan 22 hours ago

    Partition by employee_id, activity_type using row_number, then self join on login.row_number = logout.row_number; hence we can join the 1st login with the 1st logout, then group by and sum. QUERY:

    with cte as (
        select *, row_number() over (partition by employee_id, activity_type order by employee_id, activity_time) as rn,
               activity_time :: date as datee
        from swipe)
    select inn.employee_id, inn.datee, sum(out.activity_time - inn.activity_time)
    from cte inn
    join cte out on inn.employee_id = out.employee_id and inn.rn = out.rn
    where inn.activity_type = 'login'
    group by inn.employee_id, inn.datee

  • @ankithashetty9133
    @ankithashetty9133 23 hours ago

    In case the rows are way too high, how do we dynamically bring them in as column names? Is there a way to do it? It's not possible to give names individually. In my case I need to bring dates from the rows in as column names.

  • @throughmyglasses9241
    @throughmyglasses9241 23 hours ago

    -- query 1
    select employee_id, cast(activity_time as date) as date,
           max(activity_time) - min(activity_time) as time_spend_at_office
    from swipe
    group by employee_id, cast(activity_time as date)
    order by employee_id, cast(activity_time as date);

    -- query 2
    WITH CTE AS (
        select employee_id, cast(activity_time as date) as date, activity_type,
               activity_time - LAG(activity_time) OVER (partition by employee_id, cast(activity_time as date) order by activity_time) AS time_spend_in_one_swipe
        from swipe)
    select employee_id, date, SUM(time_spend_in_one_swipe) as actual_time_spend
    from CTE
    where activity_type = 'logout'
    group by employee_id, date;

  • @vikhyatjalota2213
    @vikhyatjalota2213 1 day ago

    My SQL sol: with cte_1 as( select a.company_id ,a.user_id, case when a.language = 'English' and b.language = 'German' then 1 else 0 end as flag from company_users a join company_users b on a.company_id = b.company_id and a.user_id = b.user_id and a.language < b.language ) select company_id,sum(flag) as user_count from cte_1 group by 1 having sum(flag) >=2

  • @nd9267
    @nd9267 1 day ago

    ;with cte as ( select * ,lead(activity_type,1,activity_type) over(partition by employee_id, convert(date, activity_time) order by activity_time) lead_activity_type ,lead(activity_time,1,activity_time) over(partition by employee_id, convert(date, activity_time) order by activity_time) lead_activity_time ,convert(date, activity_time) activity_day ,datediff(hour, min(activity_time) over(partition by employee_id, datepart(day, activity_time)),max(activity_time) over(partition by employee_id, datepart(day, activity_time))) total_hours from swipe ) select c.employee_id ,activity_day ,total_hours ,sum(datediff(hour, activity_time, c.lead_activity_time)) productive_hours from cte c where activity_type = 'login' and lead_activity_type = 'logout' group by c.employee_id ,activity_day ,total_hours order by activity_day

  • @raushansingh7530
    @raushansingh7530 1 day ago

    SELECT emp_name FROM emp_manager e WHERE salary > ( SELECT salary FROM emp_manager WHERE emp_id = e.manager_id ); this is another approach

  • @raushansingh7530
    @raushansingh7530 1 day ago

    select emp.emp_name as Employee_name from emp_manager as emp inner join emp_manager as manager on emp.manager_id = manager.emp_id where emp.salary > manager.salary Thanks Ankit sr

  • @ajithshetty1684
    @ajithshetty1684 1 day ago

    with cte as( select *, case when lead(activity_type) over(partition by employee_id order by day(activity_time)) = 'logout' then datediff(hour,activity_time , lead(activity_time) over(partition by employee_id order by day(activity_time))) else null end as b from swipe ) select employee_id,cast(activity_time as date) as date,datediff(hour,min(activity_time),max(activity_time)) as login_time, sum(b) as productive_hrs from cte group by employee_id,cast(activity_time as date) order by employee_id

  • @likinponnanna8990
    @likinponnanna8990 1 day ago

    My solution in PostgreSQL:

    WITH DATA AS (
        select * from (
            SELECT *, ACTIVITY_TIME :: DATE AS ACTIVITY_DATE,
                   LEAD(activity_time, 1) over (partition by employee_id, activity_time::date order by activity_time) as logout_time
            FROM practise."swipe"
        ) where activity_type = 'login')
    SELECT EMPLOYEE_ID, ACTIVITY_DATE,
           ROUND(EXTRACT(EPOCH FROM (MAX(LOGOUT_TIME) - MIN(ACTIVITY_TIME)))/3600) AS OFFICE_HRS,
           SUM(ROUND(EXTRACT(EPOCH FROM (LOGOUT_TIME - ACTIVITY_TIME))/3600)) AS PRODUCTIVE_HRS
    FROM DATA
    GROUP BY EMPLOYEE_ID, ACTIVITY_DATE

  • @uday7291
    @uday7291 1 day ago

    Hi Ankit, I really appreciate your efforts. I learnt a lot from your videos and I've become confident in SQL. Thanks a ton.

  • @myjourney352
    @myjourney352 1 day ago

    Thanks for sharing Ankit, I've learned so many things about SQL from you. Really grateful. with employee_activity as( select *, lead(activity_time) over(partition by employee_id, activity_time::date order by activity_time) log_out_time from swipe ) select employee_id, activity_time::date, extract(hour from max(log_out_time)-min(activity_time)) as total_working_hours, extract(hour from sum(log_out_time-activity_time)) as actual_working_hours from employee_activity where activity_type = 'login' group by 1,2

  • @imranpathan8352
    @imranpathan8352 1 day ago

    Hi, really helpful logic. How would you optimize this query: Select * from production where sales > 50? This is an interview question, please guide me.

    • @shekhar1523
      @shekhar1523 23 hours ago

      Instead of * you can choose column names; it reduces the load on the server.

    • @piyushbamboriya1288
      @piyushbamboriya1288 22 hours ago

      Create an index on the sales column.
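Both suggestions in this thread can be demonstrated together: select only the columns you need and index the filter column. A small sketch in Python's sqlite3, where the table `production` and its columns (`product_id`, `region`, `sales`) are hypothetical names taken from the question:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE production (product_id INT, region TEXT, sales INT)")
# Index the column used in the WHERE predicate
conn.execute("CREATE INDEX idx_production_sales ON production (sales)")
conn.executemany("INSERT INTO production VALUES (?, ?, ?)",
                 [(i, 'north', i * 10) for i in range(1, 11)])

# Narrow column list + indexed predicate instead of SELECT *
rows = conn.execute(
    "SELECT product_id, sales FROM production WHERE sales > 50 ORDER BY sales"
).fetchall()
print(rows)  # -> [(6, 60), (7, 70), (8, 80), (9, 90), (10, 100)]

# EXPLAIN QUERY PLAN typically reports SEARCH ... USING INDEX idx_production_sales
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT product_id, sales FROM production WHERE sales > 50"
).fetchall()
print(plan)
```

On a real server the same two changes apply, plus gathering statistics so the optimizer actually chooses the index.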

  • @mithunmahato309
    @mithunmahato309 1 day ago

    Narayanmurthy's time pass activity to track employees.

  • @rohithr9122
    @rohithr9122 1 day ago

    Please check my solution in sql server select t1.employee_id,DATEdiff(hour,MIN(t1.activity_time),max(t2.activity_time)) total_hours, sum(DATEDIFF(hour,t1.activity_time,t2.activity_time)) as Inside_hours from ( select employee_id,activity_type,activity_time,ROW_NUMBER()over(partition by employee_id order by activity_time)rn from swipe where activity_type = 'login')t1 join ( select employee_id,activity_type,activity_time,ROW_NUMBER()over(partition by employee_id order by activity_time)rn from swipe where activity_type = 'logout')t2 on t1.employee_id = t2.employee_id and t1.rn = t2.rn group by t1.employee_id,day(t1.activity_time)

  • @prasadjawale2675
    @prasadjawale2675 1 day ago

    tried in sql developer create view minvalue as select distinct employee_id,min(activity_time) over (partition by to_char(activity_time,'dd'),employee_id)mintime from swipe where activity_type='login' order by employee_id create view maxvalue as select distinct employee_id,max(activity_time) over(partition by to_char(activity_time,'dd'),employee_id)maxtime from swipe where activity_type='logout' order by employee_id select m1.employee_id,m1.mintime,m2.maxtime, m2.maxtime - m1.mintime from minvalue m1 join maxvalue m2 on m1.employee_id=m2.employee_id and to_char(m1.mintime,'dd')=to_char(m2.maxtime,'dd') order by employee_id

  • @vanshhans5676
    @vanshhans5676 1 day ago

    Easy in DSA but difficult in SQL 😆. Here is my solution using a self join:

    with cte1 as (
        select t1.customer, t1.start_loc
        from travel_data t1
        left join travel_data t2 on t1.customer = t2.customer and t1.start_loc = t2.end_loc
        where t2.start_loc is null
    ),
    cte2 as (
        select t1.customer, t1.end_loc
        from travel_data t1
        left join travel_data t2 on t1.customer = t2.customer and t1.end_loc = t2.start_loc
        where t2.start_loc is null
    )
    select cte1.*, cte2.end_loc
    from cte1
    inner join cte2 on cte1.customer = cte2.customer

  • @minakshi_119
    @minakshi_119 1 day ago

    select a.*,b.total_hours from (with cte as (select *, cast (activity_time as date) as dates,lead(activity_time) over(partition by employee_id,cast(activity_time as date) order by activity_time) as logout from swipes ) select employee_id,dates,sum(extract(hour from logout) -extract(hour from activity_time) )as inside_hours from cte where activity_type='login' group by 1,2) a join ( with cte2 as (select employee_id,cast(activity_time as date),extract (hour from activity_time) from swipes group by 1,2,activity_time order by 1,2,3) select employee_id,activity_time as dates, max(extract)-min(extract) as total_hours from cte2 group by 1,2)b on a.employee_id=b.employee_id and a.dates=b.dates

  • @chinmayaareddy4792
    @chinmayaareddy4792 1 day ago

    with cte_1 as ( select *,ROW_NUMBER()over(partition by employee_id order by activity_time) as rk_out from swipe where activity_type='logout'), cte_2 as ( select *,ROW_NUMBER()over(partition by employee_id order by activity_time) as rk_in from swipe where activity_type='login') ,cte_3 as (select a.*,b.activity_type as login ,b.activity_time as login_time from cte_1 a inner join cte_2 b on a.employee_id=b.employee_id and a.rk_out=b.rk_in) ,cte_4 as (select employee_id ,cast(activity_time as date) as dt, DATEDIFF(hour,login_time,activity_time) as in_off from cte_3 ) select employee_id ,dt, sum (in_off) as in_time from cte_4 group by employee_id ,dt order by employee_id

  • @surajkurle6962
    @surajkurle6962 1 day ago

    Thank you, very easy.

  • @arpanscreations6954
    @arpanscreations6954 1 day ago

    I faced this in a Clairvoyant Data Analyst interview 3 weeks back.

    • @mrx...223
      @mrx...223 1 day ago

      Are you a fresher? And what was the intensity of the other questions? Were they tough compared to this one? And beyond SQL, what other tools and technology questions did they ask you?

    • @arpanscreations6954
      @arpanscreations6954 1 day ago

      @@mrx...223 I have 4 YOE. The other questions were medium level.

    • @mrx...223
      @mrx...223 1 day ago

      @@arpanscreations6954 Excel, SQL, and what other skills did they ask questions about?

    • @arpanscreations6954
      @arpanscreations6954 21 hours ago

      @@mrx...223 Other than SQL they asked about PySpark and Python.

  • @Shubhamjaiswal-ex2oc
    @Shubhamjaiswal-ex2oc 1 day ago

    Thanks for the SQL session. Sharing another way for the 2nd part of the question, using a self join and group by:

    with tab as (
        select s1.employee_id, s1.activity_time as login_tym, s2.activity_time as logout_tym
        from swipe s1
        inner join swipe s2 on s1.employee_id = s2.employee_id
            and s1.activity_time < s2.activity_time
            and s2.activity_type <> s1.activity_type
            and s1.activity_type = 'login'),
    tab2 as (
        select employee_id, login_tym, min(logout_tym) as log_oo
        from tab
        group by employee_id, login_tym)
    select employee_id, sum(datediff(hour, login_tym, log_oo)) as efficient_hr
    from tab2
    group by employee_id;

    -- output
    -- emp hr
    -- 1   12
    -- 2   6

  • @addhyasumitra90
    @addhyasumitra90 1 day ago

    After trying for hours I could solve the 1st part. (I am a new SQL learner, need to improve myself.)

    with CTE as (
        select employee_id, cast(activity_time as date) as activity_day,
               case when activity_type = 'login' then min(activity_time) end as login_time,
               case when activity_type = 'logout' then max(activity_time) end as logout_time
        from swipe
        group by cast(activity_time as date), employee_id, activity_type)
    select employee_id, activity_day,
           min(login_time) as login, max(logout_time) as logout,
           DATEDIFF(HOUR, min(login_time), max(logout_time)) as office_timings
    from CTE
    group by employee_id, activity_day;

  • @jonedrive7268
    @jonedrive7268 1 day ago

    with cte as (select * ,(case when revenue < lead(revenue, 1, revenue + 1) over(partition by company order by year) then 1 else 0 end) as flag from company_revenue) select company from company_revenue where company not in (select company from cte where flag = 0) group by company;

  • @Chathur732
    @Chathur732 1 day ago

    with cte as (select *, case when activity_type = 'login' then hour(lag_1) - hour(activity_time) end as hour_difference from ( select * , lead(activity_time) over(partition by employee_id order by activity_time) as lag_1 from swipe) as A where case when activity_type = 'login' then hour(lag_1) - hour(activity_time) end is not null), cte_2 as ( select employee_id, date(activity_time) as date_2, max(hour(LV)-hour(activity_time)) as diff_hr_2 from( select * , last_value(activity_time) over(partition by employee_id, date(activity_time)) as LV from swipe) as B group by employee_id, date(activity_time) ) select cte.employee_id, date(cte.activity_time) as date, diff_hr_2 as total_hours, sum(cte.hour_difference) as production_hours from cte join cte_2 on cte.employee_id = cte_2.employee_id and date(cte.activity_time) = cte_2.date_2 group by employee_id, date(activity_time), diff_hr_2 order by date(cte.activity_time) Did in MYSQL workbench. It is really good. first Cte was to calculate total hours. Cte_2 was to calculate production hours. Joined both CTE in the main query.

  • @muskhad100
    @muskhad100 1 day ago

    Hi Ankit, I have solved it using the method below. I used substring to get the number value from the quarter; since we know there are only 3 possibilities if we add those numbers, I used them as conditions in a case statement:

    select store,
           case when sum(cast(substring(quarter,2,1) as int)) = 7 then 'Q3'
                when sum(cast(substring(quarter,2,1) as int)) = 6 then 'Q4'
                when sum(cast(substring(quarter,2,1) as int)) = 8 then 'Q2'
           end as missing_quarter
    from stores
    group by store;

  • @pradeepvadlakonda007
    @pradeepvadlakonda007 1 day ago

    select * from swipe;

    WITH cte AS (
        SELECT employee_id, activity_type,
               DATE(activity_time) AS date,
               TIME_TO_SEC(TIME(activity_time)) AS time_in_seconds
        FROM swipe
    ),
    time_diff AS (
        SELECT employee_id, date,
               MAX(CASE WHEN activity_type = 'logout' THEN time_in_seconds END) - MIN(CASE WHEN activity_type = 'login' THEN time_in_seconds END) AS total_seconds,
               SUM(CASE WHEN activity_type = 'login' THEN time_in_seconds END) AS total_login_seconds,
               SUM(CASE WHEN activity_type = 'logout' THEN time_in_seconds END) AS total_logout_seconds
        FROM cte
        GROUP BY employee_id, date
    )
    SELECT employee_id, date,
           ROUND(total_seconds / 3600) AS hours_worked,
           ROUND((total_logout_seconds - total_login_seconds) / 3600) AS productive_hours
    FROM time_diff
    ORDER BY date;

  • @iamanuj8942
    @iamanuj8942 1 day ago

    good one

  • @karnavsood9750
    @karnavsood9750 1 day ago

    In my Microsoft SQL Server Management Studio there is no option for SQL Server Native Client 11.0.

  • @jonedrive7268
    @jonedrive7268 1 day ago

    Video quality is pathetic.

    • @ankitbansal6
      @ankitbansal6 1 day ago

      Change the video quality from RUclips video settings

  • @RaviKanth-fx1pt
    @RaviKanth-fx1pt 1 day ago

    Thank you very much!

  • @vinitpatidar5617
    @vinitpatidar5617 1 day ago

    Most simple explanation. Thanks

  • @nagarajshenoy4848
    @nagarajshenoy4848 1 day ago

    This video is pretty simple to understand, and this expertise will come with practice. I hope all the folks practice this at least once a day. Thanks so much, Ankit!

  • @arjundev4908
    @arjundev4908 1 day ago

    /***Total hours***/ with cte as(SELECT *, row_NUMBER()OVER(partition by employee_id,extract(day from activity_time) order by activity_time) as rw from swipe) select employee_id,Date(activity_time) as Dates, round((max(activity_time) - min(activity_time))/10000,0) as Total_hrs from cte group by 1,2 order by 2; /***Actual hours***/ Select * from swipe; WITH CTE AS(Select employee_id, case when activity_type = 'login' then activity_time END as login, case when activity_type = 'logout' then activity_time END as logout from swipe),V1 AS( SELECT *, LEAD(LOGOUT)OVER(partition by employee_id)AS NXT FROM CTE),V2 AS( SELECT *, ROUND((NXT-LOGIN))/10000 AS DIFFERENCE FROM V1 WHERE LOGIN IS NOT NULL) SELECT employee_id,DATE(login) AS DATES,ROUND(SUM(DIFFERENCE),0)AS TotalHrs FROM V2 group by 1,2;

  • @varunas9784
    @varunas9784 1 day ago

    Good one as always Ankit! Here's my approach: =================================== with actual_working_hrs as (select employee_id, DATETRUNC(DAY, activity_time) [login date], SUM([status hours]) [actual login hours] from (select *, DATEDIFF(HOUR, activity_time,LEAD(activity_time, 1) over(partition by employee_id, DAY(activity_time), MONTH(activity_time) order by activity_time)) [status hours] from swipe) s1 where activity_type = 'login' group by employee_id, DATETRUNC(DAY, activity_time)), net_login as (select Employee_id, DATETRUNC(DAY, activity_time) [login date], DATEDIFF(HOUR, MIN(activity_time), MAX(activity_time)) [net login hours] from swipe group by Employee_id, DATETRUNC(DAY, activity_time)) select a.*, [net login hours] from actual_working_hrs a join net_login n on a.employee_id = n.employee_id and a.[login date] = n.[login date] ===================================

  • @rakeshranjan7647
    @rakeshranjan7647 1 day ago

    My solution with cte as ( select department_id,AVG(salary) as Avgsal, (select Company_sal=AVG(salary) from emp e where e.department_id<>e1.department_id ) Company_sal from emp e1 group by department_id) select * from cte where avgsal<Company_sal

  • @ankitbansal6
    @ankitbansal6 1 day ago

    Excited to share that you can now practice SQL interview problems on Namaste SQL 😍😍: www.namastesql.com/coding-problems

  • @harshithapolisetty3501
    @harshithapolisetty3501 1 day ago

    with cte as(select order_id,customer_id,order_date,order_amount, row_number() over (partition by customer_id order by order_Date) as rk from customer_orders ) select order_date, sum(case when rk=1 then 1 else 0 end) as new, sum(case when rk>1 then 1 else 0 end) as repeat from cte group by order_date

  • @ayushibhawsar8699
    @ayushibhawsar8699 1 day ago

    with cte as ( select user_id,sum(case when filter_room_types like '%entire%' then 1 else 0 end) as entire, sum(case when filter_room_types like '%shared%' then 1 else 0 end) as shared, sum(case when filter_room_types like '%private%' then 1 else 0 end) as private from airbnb_searches group by user_id) select 'Entire' as value, sum(entire) as no_of_searches from cte union all select 'Shared' as value,sum(shared) as no_of_searches from cte union all select 'Private' as value,sum(private) as no_of_searches from cte order by no_of_searches desc