Scrolling Through Pixels: The Mogz Visuals Website Saga

A journey through parallax effects, smooth scrolling, and creative problem-solving as we craft a cutting-edge digital presence for Juba's premier media studio.

Jul 2024 · Completed · mogz.studio · Source
[Carousel: four project images]

Table of Contents

  • The Client and the Vision
  • The Technical Challenge: Smooth Scrolling in Next.js
  • Snippet: scrollContext.tsx
  • Snippet: LocomotiveScrollSection.tsx
  • The Solution: Secure Private Galleries
  • Snippet: useAutoDeleteCookie.ts
  • Snippet: route.ts
  • Snippet: middleware.ts
  • The Feature: On-the-Fly Collection Downloads
  • Snippet: useDownloadCollection.ts
  • Phase 2: The Reality Check
  • 1. Off-Main-Thread Processing with Web Workers
  • Snippet: zip.worker.ts
  • Snippet: useDownloadCollection.ts
  • 2. Performance: Infinite Scroll & Caching
  • Snippet: useInfiniteScroll.ts
  • Snippet: route.ts
  • 3. Reducing Friction in Authentication
  • Lessons Learned
  • Final Thoughts

The Client and the Vision

In the heart of Juba's thriving media scene, Mogz Visuals has a stellar reputation for high-end photography. Led by founder Jacob Mogga Kei, their work is exceptional, but their online presence didn't yet match the quality of their portfolio. They approached me to build a website that would truly capture the essence of their brand.

The vision was clear. Jacob needed a site that could:

  • Showcase their work in a visually stunning, immersive way.
  • Provide a seamless client experience for accessing and sharing photos.
  • Offer a secure system for private client collections.
  • Include an intuitive download feature for entire photo sets.

This project would become a deep dive into advanced scroll mechanics, secure authentication flows, and on-the-fly file compression, all in service of delivering a cutting-edge digital experience.

The Technical Challenge: Smooth Scrolling in Next.js

To create the immersive feel the client wanted, I decided against a standard portfolio layout. Instead, I opted for a more dynamic experience using parallax effects and smooth scrolling, inspired by the creative implementations on sites like Codrops. This led me to Locomotive Scroll, a powerful library for creating silky-smooth scroll effects.

However, integrating it into a modern Next.js 13 project presented a significant architectural challenge. Locomotive Scroll is a client-side library that wants to take full control of the page's scroll container. This directly conflicts with Next.js's modern App Router, which is designed around server components that render independently of the client-side environment.

The solution was to architect a system that could isolate the client-side library without breaking the server-first paradigm of Next.js. I accomplished this using a React Context Provider.

This ScrollProvider acts as a boundary. It wraps the parts of the application that need smooth scrolling and initializes Locomotive Scroll inside a useEffect, so the library is dynamically imported and instantiated only on the client. This keeps the server components completely unaware of the library's existence, resolving the core conflict. The provider then uses React Context to pass the scroll instance and its data (like scroll position) down to any child component that needs it.

Snippet: scrollContext.tsx

'use client';
// Type-only import for annotations; assumes @types/locomotive-scroll is installed
import type LocomotiveScroll from 'locomotive-scroll';
import { usePathname } from 'next/navigation';
import {
  ReactNode,
  createContext,
  useContext,
  useEffect,
  useRef,
  useState,
} from 'react';

type ScrollContextValue = {
  scrollInstance: LocomotiveScroll | null;
  scrollToSection: (id: string) => void;
};

const ScrollContext = createContext<ScrollContextValue | null>(null);

export const useScroll = (): ScrollContextValue => {
  const context = useContext(ScrollContext);
  if (!context) {
    throw new Error('useScroll must be used within a ScrollProvider');
  }
  return context;
};

export const ScrollProvider = ({ children }: { children: ReactNode }) => {
  const scrollRef = useRef<LocomotiveScroll | null>(null);
  const [scrollInstance, setScrollInstance] =
    useState<LocomotiveScroll | null>(null);
  const pathname = usePathname();

  useEffect(() => {
    const initializeScroll = async () => {
      // Destroy any previous instance before re-initializing on route change
      if (scrollRef.current) {
        scrollRef.current.destroy();
      }

      // Dynamic import keeps the library out of the server bundle
      const LocomotiveScroll = (await import('locomotive-scroll')).default;
      const scroll = new LocomotiveScroll({
        el: document.querySelector('[data-scroll-container]') as HTMLElement,
        lerp: 0.05,
        smooth: true,
        reloadOnContextChange: true,
        smartphone: { smooth: true },
        touchMultiplier: 3,
      });

      scrollRef.current = scroll;
      setScrollInstance(scroll);
    };

    initializeScroll();

    return () => {
      if (scrollRef.current) {
        scrollRef.current.destroy();
        scrollRef.current = null;
        setScrollInstance(null);
      }
    };
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [pathname]);

  const scrollToSection = (id: string) => {
    if (scrollRef.current) {
      scrollRef.current.scrollTo(id);
    }
  };

  return (
    <ScrollContext.Provider
      value={{
        scrollToSection,
        scrollInstance,
      }}
    >
      {children}
    </ScrollContext.Provider>
  );
};


To make implementation easier, I created a simple wrapper component, shown below, that applies the necessary data-scroll-section attribute, allowing me to designate which parts of the page should be controlled by the scroll library.

Snippet: LocomotiveScrollSection.tsx

import { ReactNode } from 'react';
import { Tag } from '@/lib/types';

// Define the allowed HTML tags for the component
type SectionTags = Extract<Tag, 'section' | 'div' | 'footer'>;

type LocomotiveScrollWrapperProps = {
  children: ReactNode;
  Tag?: SectionTags;
  className?: string;
  [x: string]: any;
};

// Component to wrap content with LocomotiveScroll and dynamic HTML tags
const LocomotiveScrollSection = ({
  children,
  className,
  Tag = 'section', // Default tag is 'section'
  ...rest
}: LocomotiveScrollWrapperProps) => {
  return (
    <Tag
      data-scroll-section
      className={`overflow-hidden ${className ?? ''}`} // Avoid "undefined" in the class list
      {...rest} // Spread any additional props
    >
      {children}
    </Tag>
  );
};

export default LocomotiveScrollSection;

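A quick usage sketch; the import path, section content, and data-scroll attributes here are illustrative, not taken from the codebase:

import LocomotiveScrollSection from '@/components/LocomotiveScrollSection';

// Illustrative usage: a hero section that participates in smooth scrolling
const Hero = () => (
  <LocomotiveScrollSection Tag='section' className='h-screen'>
    <h1 data-scroll data-scroll-speed='2'>
      Mogz Visuals
    </h1>
  </LocomotiveScrollSection>
);

export default Hero;
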
The Solution: Secure Private Galleries

A critical requirement for Mogz Visuals was a secure portal for clients to view their private photo collections. The system needed to be robust and trustworthy. I engineered a solution using encrypted, auto-expiring session cookies and Next.js middleware.

The authentication flow works like this:

• Verification: A client enters their collection ID and password into a form. This data is sent to a server-side API route.
• Encryption: The API route checks the credentials against the data stored in the Sanity CMS. If they match, it uses CryptoJS to encrypt the unique collection slug and sets it in a secure, HTTP-only cookie with a one-hour expiry (sketched below).
• Middleware Protection: A Next.js middleware file is configured to protect all routes under "/private/*". On every request to a private page, the middleware decrypts the cookie and verifies that its slug matches the slug of the requested page. If the cookie is invalid or absent, the user is immediately redirected away from the private content (sketched below).

This creates a secure, temporary session for clients without requiring a full user account system. To ensure the session ends properly, a custom hook, shown below, also removes the cookie when the user closes their browser tab or after the one-hour timer expires.

Snippet: useAutoDeleteCookie.ts

import Cookies from 'js-cookie';
import { useEffect, useState } from 'react';
import { useRouter } from 'next/navigation';
import { useToast } from '../context/ToastContext';

export const useAutoDeleteCookie = (slug: string, isPrivate: boolean) => {
  const [decryptedSlug, setDecryptedSlug] = useState<string | null>(null);
  const { show } = useToast();
  const router = useRouter();

  useEffect(() => {
    if (isPrivate) {
      const func = async () => {
        const encryptedCookie = Cookies.get('collectionAccess');
        if (encryptedCookie) {
          const parsedCookie = JSON.parse(encryptedCookie);
          console.log('encrypted slug', parsedCookie.slug);
          const decryptedSlug = await getDecryptedSlug(parsedCookie.slug);

          setDecryptedSlug(decryptedSlug);
        }
      };
      // Decrypt the cookie and set the state
      func();
    }
  }, [isPrivate]);

  useEffect(() => {
    if (isPrivate && decryptedSlug) {
      const timer = setTimeout(() => {
        if (slug === decryptedSlug) {
          Cookies.remove('collectionAccess');
          router.push('/gallery');
          show('Your access to private collection expired!', {
            status: 'info',
            autoClose: false,
          });
        }
      }, 1 * 60 * 60 * 1000); // 1 hour in milliseconds

      // Listen to the 'beforeunload' event to delete the cookie when the tab is closed
      const handleBeforeUnload = async () => {
        if (slug === decryptedSlug) {
          Cookies.remove('collectionAccess');
        }
      };
      window.addEventListener('beforeunload', handleBeforeUnload);

      return () => {
        clearTimeout(timer);
        window.removeEventListener('beforeunload', handleBeforeUnload);
      };
    }
  }, [decryptedSlug, isPrivate, router, show, slug]);
};

const getDecryptedSlug = async (encryptedCookie: string) => {
  const response = await fetch('/api/decryptCookie', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ encryptedCookie }),
  });
  const { decryptedSlug } = await response.json();

  return decryptedSlug;
};

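What follow are minimal sketches of the API route and middleware described above, not the production files: the file paths, the getPrivateCollectionById query name, and the COOKIE_SECRET environment variable are assumptions reconstructed from the prose.

Snippet: route.ts (sketch)

// src/app/api/verifyCollection/route.ts — hypothetical path
import { NextRequest, NextResponse } from 'next/server';
import CryptoJS from 'crypto-js';
import { fetchSanityData } from '@/lib/sanity/client';
import { getPrivateCollectionById } from '@/lib/sanity/queries'; // hypothetical query

export async function POST(request: NextRequest) {
  const { id, password } = await request.json();

  // Check the submitted credentials against the collection stored in Sanity
  const collection = await fetchSanityData(getPrivateCollectionById, { id });
  if (!collection || collection.password !== password) {
    return NextResponse.json({ message: 'Invalid credentials' }, { status: 401 });
  }

  // Encrypt the collection slug and set it in a cookie with a one-hour expiry
  const encryptedSlug = CryptoJS.AES.encrypt(
    collection.slug.current,
    process.env.COOKIE_SECRET! // hypothetical secret
  ).toString();

  const response = NextResponse.json({ success: true });
  response.cookies.set(
    'collectionAccess',
    JSON.stringify({ slug: encryptedSlug }),
    {
      // The write-up says HTTP-only; note that the client-side hook above reads
      // this cookie with js-cookie, which would require httpOnly: false in practice.
      httpOnly: true,
      secure: true,
      maxAge: 60 * 60, // one hour
      path: '/',
    }
  );
  return response;
}

Snippet: middleware.ts (sketch)

// src/middleware.ts — hypothetical reconstruction; redirect target is an assumption
import { NextRequest, NextResponse } from 'next/server';
import CryptoJS from 'crypto-js';

export function middleware(request: NextRequest) {
  const cookie = request.cookies.get('collectionAccess')?.value;
  // The requested collection slug is the last path segment, e.g. /private/my-shoot
  const requestedSlug = request.nextUrl.pathname.split('/').pop();

  if (cookie) {
    try {
      const { slug } = JSON.parse(cookie);
      const decryptedSlug = CryptoJS.AES.decrypt(
        slug,
        process.env.COOKIE_SECRET! // must match the secret used in route.ts
      ).toString(CryptoJS.enc.Utf8);
      if (decryptedSlug === requestedSlug) {
        return NextResponse.next(); // valid session: allow the request through
      }
    } catch {
      // Malformed cookie: fall through to the redirect below
    }
  }

  // Invalid or absent cookie: redirect away from the private content
  return NextResponse.redirect(new URL('/gallery', request.url));
}

export const config = {
  matcher: '/private/:path*',
};
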
The Feature: On-the-Fly Collection Downloads

To complete the client workflow, I built a feature allowing users to download an entire collection of images as a single zip file. The entire process is handled on the client side to avoid server load.

I created a custom hook, useDownloadCollection, that performs several actions:

• Fetches Images: It takes a list of image URLs and fetches each one as a blob.
• Zips in Memory: Using the JSZip library, it creates a new zip archive in the browser's memory and adds each image blob to it.
• Triggers Download: Once all images are added, it generates the final zip file and uses the file-saver library to prompt the user to download it.

The hook, shown below, also integrates a simple API-based rate limiter to prevent abuse and adds the user's email to a marketing audience in Resend, helping the client build their mailing list.

Snippet: useDownloadCollection.ts

import JSZip from 'jszip';
import { useState } from 'react';
import { saveAs } from 'file-saver';
import { useToast } from '../context/ToastContext';
import { fetchSanityData } from '../sanity/client';
import { getPrivateCollectionGallery } from '../sanity/queries';
import { COLLECTION } from '../types';

const useDownloadCollection = ({ title, uniqueId, gallery }: COLLECTION) => {
  const [loading, setLoading] = useState(false);
  const { show } = useToast();

  const folderName = `[MOGZ] ${title}`;
  const zip = new JSZip();
  const folder = zip.folder(folderName);

  const showToast = (
    message: string,
    status: 'success' | 'error',
    autoClose: boolean = true
  ) => {
    show(message, { status, autoClose });
  };

  const checkRateLimit = async (id: string): Promise<boolean> => {
    const response = await fetch(`/api/rateLimit?id=${id}`, {
      method: 'GET',
    });
    if (!response.ok) {
      const { message } = await response.json();
      console.log('Rate limit status:', response.status, message);
      showToast('Rate limit exceeded, please try again later.', 'error', false);
      return false;
    }
    return true;
  };

  const addEmailToAudience = async (email: string) => {
    try {
      const response = await fetch('/api/contact/audience', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({ email }),
      });
      console.log(response);
    } catch (error) {
      console.log(error);
    }
  };

  const fetchImages = async () => {
    const gallery: string[] = await fetchSanityData(
      getPrivateCollectionGallery,
      { id: uniqueId }
    );
    return gallery;
  };

  const downloadImages = async (email: string) => {
    setLoading(true);
    let images = gallery;
    try {
      if (!(await checkRateLimit('download'))) return;

      await addEmailToAudience(email);

      if (!images) {
        images = await fetchImages();
      }

      const imageFetchPromises = images.map(async (image, index) => {
        try {
          const response = await fetch(image);
          if (!response.ok) {
            throw new Error(
              `Failed to fetch image at index ${index}, status: ${response.status}`
            );
          }
          const blob = await response.blob();
          if (!folder) {
            throw new Error('folder is undefined');
          }
          folder.file(generateImageName(title, index), blob, { binary: true });
        } catch (err) {
          console.error(`Error fetching image at index ${index}:`, err);
          throw err;
        }
      });

      await Promise.all(imageFetchPromises);

      console.log('Adding images done, proceeding to ZIP...');
      const content = await zip.generateAsync({ type: 'blob' });

      saveAs(content, `${folderName}.zip`);
      showToast('Collection downloaded successfully!', 'success');
    } catch (err: any) {
      console.error(err);
      showToast(
        `An error occurred while downloading the collection! Try again later.`,
        'error'
      );
    } finally {
      setLoading(false);
    }
  };

  return {
    loading,
    downloadImages,
  };
};

export default useDownloadCollection;

const generateImageName = (title: string, index: number): string => {
  const formattedTitle = title.replace(/\s/g, '-');
  return `[MOGZ]-${formattedTitle}-${index + 1}.jpg`;
};

Update: While this architecture worked perfectly for standard portfolios, we quickly learned that "standard" is a dangerous assumption. When the client uploaded a massive wedding collection containing over 850 high-res images, the browser's main thread choked, and the user experience crumbled. This led to a necessary re-architecture in Phase 2.

Phase 2: The Reality Check

Launching a site is rarely the end of the story; it's usually just the start of the conversation. After Mogz Visuals went live, real-world usage exposed edge cases I hadn't anticipated. The system wasn't just handling 500-image portfolios; it was being hit with 850+ image collections, causing browser freezes and network timeouts.

Here is how I refactored the core features to handle scale and reduce friction.

1. Off-Main-Thread Processing with Web Workers

The initial client-side zip solution was blocking the main thread. To solve this, I moved the heavy lifting to a Web Worker. This allows the zipping process to run in the background without locking up the UI.

I also realized that downloading large collections in a single go was a recipe for failure on unstable networks. I pivoted to a partitioned download strategy: splitting the collection into chunks of 100 images.

Here is how the zip.worker.ts handles the image fetching and compression in isolation:

Snippet: zip.worker.ts

import * as Comlink from 'comlink';
import JSZip from 'jszip';

const zipImages = async (
  imageUrls: string[],
  collectionTitle: string,
  onProgress?: (progress: number) => void
) => {
  const zip = new JSZip();
  const folder = zip.folder(collectionTitle);

  if (!folder) {
    throw new Error('Could not create folder in zip.');
  }

  let imagesProcessed = 0;

  for (const imageUrl of imageUrls) {
    try {
      const response = await fetch(imageUrl);
      if (!response.ok) {
        throw new Error(`Failed to fetch ${imageUrl}: ${response.statusText}`);
      }
      const blob = await response.blob();
      const filename = imageUrl.substring(imageUrl.lastIndexOf('/') + 1);
      folder.file(filename, blob);

      imagesProcessed++;
      if (onProgress) {
        onProgress((imagesProcessed / imageUrls.length) * 100);
      }
    } catch (error) {
      console.error(`Error processing image ${imageUrl}:`, error);
      // Continue with other images even if one fails
    }
  }

  const content = await zip.generateAsync({ type: 'blob' });
  return content;
};

const api = {
  zipImages: Comlink.proxy(zipImages),
};

Comlink.expose(api);

In the React hook, I calculate the necessary segments based on the total image count. This gives the user the flexibility to download specific parts or the whole collection in manageable sequential batches.

Snippet: useDownloadCollection.ts

// src/lib/hooks/useDownloadCollection.ts
// ... imports
const CHUNK_SIZE = 100;

// Logic to split the massive collection into manageable 100-image chunks
const numChunks = imageCount ? Math.ceil(imageCount / CHUNK_SIZE) : 0;
const newSegments = Array.from({ length: numChunks }, (_, i) => {
  const start = i * CHUNK_SIZE;
  let end = start + CHUNK_SIZE;
  if (end > imageCount) end = imageCount;
  return { start, end };
});
setSegments(newSegments);

// The execution now delegates to the worker
const _zipAndSave = async (images: string[], segmentIndex: number) => {
  const worker = new Worker(
    new URL('../workers/zip.worker.ts', import.meta.url)
  );
  const workerApi = Comlink.wrap<any>(worker);
  // ... trigger worker and saveAs
};

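The excerpt elides the actual worker invocation, so here is a hedged sketch of what it might look like; the function name is illustrative, and wrapping the progress callback with Comlink.proxy so it can cross the thread boundary is an assumption:

import * as Comlink from 'comlink';
import { saveAs } from 'file-saver';

// Illustrative sketch: run one segment through the zip worker and save the result
const zipSegmentAndSave = async (images: string[], title: string) => {
  const worker = new Worker(
    new URL('../workers/zip.worker.ts', import.meta.url)
  );
  const workerApi = Comlink.wrap<any>(worker);

  // Callbacks must be proxied so the worker can call back across threads
  const onProgress = Comlink.proxy((progress: number) => {
    console.log(`Zipping: ${Math.round(progress)}%`);
  });

  const blob: Blob = await workerApi.zipImages(images, title, onProgress);
  saveAs(blob, `${title}.zip`);
  worker.terminate(); // free the worker once the segment is done
};
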
2. Performance: Infinite Scroll & Caching

Another issue with large collections was the initial load time. Loading 800 images at once inflated page load time and produced broken thumbnails for images deep down the page. I realized most users never scroll to the very bottom, so loading everything upfront was wasteful.

I implemented an infinite scroll solution that loads images in batches of 20. Crucially, I integrated this with an API route that caches the results, ensuring that subsequent requests for the same segment are instant.

I hooked into the existing scrollContext (Locomotive Scroll) to detect when the user reaches the trigger point:

Snippet: useInfiniteScroll.ts

import { useState, useEffect, useRef, useCallback } from 'react';
import { useScroll } from '../context/scrollContext';
import { COLLECTION } from '../types';

const PAGE_SIZE = 20;

export const useInfiniteScroll = (collection: COLLECTION) => {
  const { scrollInstance } = useScroll();
  const [images, setImages] = useState(collection.gallery || []);
  const [isLoading, setIsLoading] = useState(false);
  const [hasMore, setHasMore] = useState(
    collection.imageCount > (collection.gallery?.length || 0)
  );
  const page = useRef(1);

  const stateRef = useRef({ isLoading, hasMore });
  useEffect(() => {
    stateRef.current = { isLoading, hasMore };
  }, [isLoading, hasMore]);

  const fetchMoreImages = useCallback(async () => {
    if (stateRef.current.isLoading || !stateRef.current.hasMore) return;

    setIsLoading(true);
    const start = page.current * PAGE_SIZE;
    const end = start + PAGE_SIZE;

    const collectionId = collection.isPrivate
      ? collection.uniqueId
      : collection.slug?.current;

    try {
      const response = await fetch(
        `/api/gallery?collectionId=${collectionId}&isPrivate=${collection.isPrivate}&start=${start}&end=${end}`
      );
      const newImages = await response.json();

      if (newImages.length > 0) {
        setImages((prev) => [...prev, ...newImages]);
        page.current += 1;
      }

      if (newImages.length < PAGE_SIZE) {
        setHasMore(false);
      }
    } catch (error) {
      console.error('Failed to fetch more images:', error);
    } finally {
      setIsLoading(false);
    }
  }, [collection]);

  useEffect(() => {
    if (!scrollInstance) return;

    const callHandler = (func: string | string[]) => {
      const processFunc = (f: string) => {
        if (f === 'fetchMore') {
          fetchMoreImages();
        }
      };

      if (Array.isArray(func)) {
        func.forEach(processFunc);
      } else {
        processFunc(func);
      }
    };

    scrollInstance.on('call', callHandler);
  }, [scrollInstance, fetchMoreImages]);

  useEffect(() => {
    if (scrollInstance) {
      const timer = setTimeout(() => scrollInstance.update(), 200);
      return () => clearTimeout(timer);
    }
  }, [images, scrollInstance]);

  return { images, isLoading, hasMore };
};

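A minimal sketch of what that caching gallery route could look like; the getCollectionGallerySegment query name and the per-instance in-memory Map cache are assumptions (a shared store like Redis would survive server restarts):

Snippet: route.ts (sketch)

// src/app/api/gallery/route.ts — reconstructed sketch, not the production code
import { NextRequest, NextResponse } from 'next/server';
import { fetchSanityData } from '@/lib/sanity/client';
import { getCollectionGallerySegment } from '@/lib/sanity/queries'; // hypothetical query

// Naive per-instance cache: repeat requests for the same segment return instantly
const segmentCache = new Map<string, string[]>();

export async function GET(request: NextRequest) {
  const { searchParams } = request.nextUrl;
  const collectionId = searchParams.get('collectionId');
  const isPrivate = searchParams.get('isPrivate') === 'true';
  const start = Number(searchParams.get('start') ?? 0);
  const end = Number(searchParams.get('end') ?? 0);

  const key = `${collectionId}:${isPrivate}:${start}:${end}`;
  const cached = segmentCache.get(key);
  if (cached) {
    return NextResponse.json(cached);
  }

  // GROQ slicing (e.g. gallery[$start...$end]) keeps each payload small
  const images: string[] = await fetchSanityData(getCollectionGallerySegment, {
    id: collectionId,
    isPrivate,
    start,
    end,
  });

  segmentCache.set(key, images);
  return NextResponse.json(images);
}
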
3. Reducing Friction in Authentication

While the middleware security was robust, the UX was clunky. Clients found it unintuitive to manually enter collection IDs and passwords every time.

I simplified the flow by introducing direct access links:

https://www.mogz.studio/private?id={uniqueId}

When a user clicks this link, the ID is pre-filled in the modal. We still maintain the strict security of the auto-expiring cookie, but the entry barrier is significantly lower. It's a prime example of how security doesn't have to come at the expense of usability.
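
A sketch of the pre-fill behavior, assuming a client component that reads the query string with useSearchParams; the component name and form markup are illustrative:

'use client';
import { FormEvent, useState } from 'react';
import { useSearchParams } from 'next/navigation';

const PrivateAccessForm = () => {
  const searchParams = useSearchParams();
  // Pre-fill from the direct access link: /private?id={uniqueId}
  const [collectionId, setCollectionId] = useState(searchParams.get('id') ?? '');
  const [password, setPassword] = useState('');

  const handleSubmit = (e: FormEvent) => {
    e.preventDefault();
    // POST the credentials to the verification route described earlier
  };

  return (
    <form onSubmit={handleSubmit}>
      <input
        value={collectionId}
        onChange={(e) => setCollectionId(e.target.value)}
        placeholder='Collection ID'
      />
      <input
        type='password'
        value={password}
        onChange={(e) => setPassword(e.target.value)}
        placeholder='Password'
      />
      <button type='submit'>View Collection</button>
    </form>
  );
};

export default PrivateAccessForm;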


Lessons Learned

This project was a fantastic learning experience that pushed me to solve several complex, real-world problems. The journey from concept to completion was a deep dive into modern web development practices, and my toolkit is considerably larger for it. Key takeaways include:

• Integrating Third-Party Libraries: I learned the intricacies of working with DOM-heavy, client-side libraries like Locomotive Scroll within a server-component framework like Next.js. This required creating architectural boundaries with React Context to ensure both parts of the app could function without conflict.
• Secure Authentication Flows: Building the private gallery system was a practical lesson in security. I mastered the use of encrypted session cookies and middleware to protect routes and manage temporary user access in a secure, stateless way.
• Client-Side File Manipulation: The download feature was an opportunity to work with file compression and download handling directly in the browser. Using libraries like JSZip to process files on the client side is a powerful technique for creating performant features that don't overload the server.
• Resilience Over Idealism: Phase 2 taught me that assuming a "happy path" (good network, reasonable file sizes) is dangerous. Moving logic to Web Workers and implementing chunking wasn't just an optimization; it was a necessary survival step for the application in a production environment.

Final Thoughts

The Mogz Visuals website is more than just a portfolio; it's a digital experience designed to capture the essence of their artistry. The final platform successfully delivered on the client's vision, providing a visually stunning showcase for their work and a secure, seamless portal for their clients.

This project stands as a testament to what can be achieved when cutting-edge web technologies are combined with creative vision and persistence. For Mogz Visuals, it's a digital home that truly reflects their artistic prowess and positions them for continued success in Juba's vibrant media scene.
