
SQL Interview Question - Solution (Part - XVII)

MeanLifeStudies
689 views

#sql #education #sqlfunctions #dataanalyst #dataengineers
#MeanLifeStudies #sqlinterview #datascience #interview #dataanalystinterview
Here are my profiles, which will definitely help your preparation for data analyst or data engineer roles.
Medium: / mahendraee204
Github: github.com/mahendra204
Here are the table create and insert statements:
----------------------------------------------------------------------
create table sales_info (
TransactionID int primary key,
Date date,
CustomerID int,
CustomerName varchar(20),
ProductID varchar(20),
Quantity INT,
SalesAmount float,
StoreID varchar(20),
StoreName varchar(20)
);
INSERT INTO Sales_Info values (1, '2023-01-01', 1000, 'Mahendra', 'P1', 2, 20, 'S1', 'Store A'),
(2, '2023-01-02', 1003, 'Mahi', 'P2', 1, 50, 'S2', 'Store B'),
(3, '2023-01-03', 1008, 'Ananya', 'P3', 3, 45, 'S3', 'Store C'),
(4, '2023-01-04', 2004, 'Rohit', 'P2', 1, 50, 'S1', 'Store A'),
(5, '2023-01-05', 3009, 'Priya', 'P1', 2, 20, 'S2', 'Store B'),
(6, '2023-01-06', 1000, 'Mahendra', 'P3', 4, 60, 'S3', 'Store C'),
(7, '2023-01-07', 1003, 'Mahi', 'P2', 2, 100, 'S1', 'Store A'),
(8, '2023-01-08', 1008, 'Ananya', 'P3', 1, 15, 'S2', 'Store B'),
(9, '2023-01-09', 2004, 'Rohit', 'P1', 1, 120, 'S3', 'Store C'),
(10, '2023-01-10', 3009, 'Priya', 'P2', 3, 150, 'S1', 'Store A'),
(11, '2023-01-11', 1000, 'Mahendra', 'P1', 1, 510, 'S2', 'Store B'),
(12, '2023-01-12', 1003, 'Mahi', 'P3', 2, 300, 'S3', 'Store C'),
(13, '2023-01-13', 1008, 'Ananya', 'P2', 1, 580, 'S1', 'Store A'),
(14, '2023-01-14', 2004, 'Rohit', 'P1', 2, 200, 'S2', 'Store B'),
(15, '2023-01-15', 3009, 'Priya', 'P3', 1, 180, 'S3', 'Store C'),
(16, '2023-01-16', 1000, 'Mahendra', 'P2', 3, 950, 'S1', 'Store A'),
(17, '2023-01-17', 1003, 'Mahi', 'P1', 1, 100, 'S2', 'Store B'),
(18, '2023-01-18', 1008, 'Ananya', 'P3', 2, 300, 'S3', 'Store C'),
(19, '2023-01-19', 2004, 'Rohit', 'P2', 1, 500, 'S1', 'Store A'),
(20, '2023-01-20', 3009, 'Priya', 'P1', 3, 350, 'S2', 'Store B');

Published: 5 Jul 2024

Comments: 6
@bankimdas9517 (21 days ago)
Thanks for making this video. Please bring more questions on data analysis topics.
@MeanLifeStudies (21 days ago)
Definitely. Thanks for the support.
@anilkumark3573 (18 days ago)
My solution:
with cteone as (
    select storeid, salesamount as sales, date,
           datepart(week, date) as week,
           dense_rank() over (partition by storeid order by datepart(week, date) asc)  as first_week,
           dense_rank() over (partition by storeid order by datepart(week, date) desc) as last_week
    from sales_info
),
ctwo as (
    select storeid,
           sum(case when first_week = 1 then sales end) as first_amount,
           sum(case when last_week = 1 then sales end)  as last_amount
    from cteone
    group by storeid
)
select storeid,
       round(((last_amount - first_amount) / first_amount) * 100, 2) as perc_increase
from ctwo;
@MeanLifeStudies (18 days ago)
Kindly check your solution once. Why are you taking the maximum sales amount from each week? The question asks for the percentage increase in total sales from the first week to the last week: we need to sum all the sales in the first week for each store, sum all the sales in the last week, and then find the percentage increase, right?
@anilkumark3573 (18 days ago)
@MeanLifeStudies Agreed, it should be SUM. Correction done.
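
For readers following along, here is a minimal sketch of the approach described in the reply above: total the sales per store per week first, then compare each store's first and last week. It assumes a SQL Server dialect (DATEPART week numbering, as in the comment's query) and the sales_info table defined earlier; the CTE and alias names are illustrative, and this is not the exact query from the video.
----------------------------------------------------------------------
with weekly as (
    -- total sales per store per calendar week
    select
        storeid,
        datepart(week, date) as wk,
        sum(salesamount)     as week_sales
    from sales_info
    group by storeid, datepart(week, date)
),
ranked as (
    -- rank each store's weeks from earliest and from latest
    select
        storeid,
        week_sales,
        dense_rank() over (partition by storeid order by wk asc)  as rn_first,
        dense_rank() over (partition by storeid order by wk desc) as rn_last
    from weekly
)
select
    storeid,
    -- percentage increase from the first week's total to the last week's total
    round(
        (max(case when rn_last = 1 then week_sales end)
       - max(case when rn_first = 1 then week_sales end))
      / max(case when rn_first = 1 then week_sales end) * 100, 2) as perc_increase
from ranked
group by storeid;
----------------------------------------------------------------------
The only real difference from the commenter's corrected query is that the weekly totals are aggregated before ranking, so the first/last-week amounts are already sums when they are picked out.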