
Zipping/Unzipping in C#

These past few days I needed to implement a way to unzip a zip package to a specific location. All well and good; I figured I wouldn't have any problems, GZipStream would work perfectly...
But it wasn't that simple. The problem is that GZipStream only knows how to handle streams, so if you have a zip made up of a whole structure of directories and files, it has little chance of handling that out of the box.
One option was to implement this mechanism myself. I could have done that, but it would have exceeded the time allocated for this task.
Looking for a library that was already written, I found DotNetZip: a fairly complex library that lets you do a great many things, yet with a simple API that is very easy to understand.
Why did I choose this solution? After adding it as a reference, I copy/pasted the code from the examples on their site and it worked without any problems.
For example, to add something to an archive it is enough to call .AddFile(pathFisierului), or point it at a directory with .AddDirectory(...), and at the end call .Save(numeArhiva).
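A minimal sketch of creating an archive this way (the file and directory paths are hypothetical placeholders):

// requires: using Ionic.Zip;
using (ZipFile zip = new ZipFile())
{
    zip.AddFile(@"C:\temp\report.docx");    // add a single file (hypothetical path)
    zip.AddDirectory(@"C:\temp\images");    // add an entire directory tree
    zip.Save(@"C:\temp\archive.zip");       // write the archive to disk
}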
Here is the code I needed:
// requires: using Ionic.Zip;
using (ZipFile zip = ZipFile.Read(locatieZipString))
{
    foreach (ZipEntry fis in zip)
    {
        // extract each entry into the target directory, preserving relative paths
        fis.Extract(locatieUndeSeCopieazaString);
    }
}

The collections also support LINQ, so any filtering of the extracted files can be done very easily, as in the sketch below.
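A hypothetical filter that extracts only the .txt entries, reusing the variable names from the snippet above (ZipFile implements IEnumerable&lt;ZipEntry&gt;, so the standard LINQ operators apply):

// requires: using System; using System.Linq; using Ionic.Zip;
using (ZipFile zip = ZipFile.Read(locatieZipString))
{
    // keep only the entries whose names end in .txt
    var textEntries = zip.Where(e => e.FileName.EndsWith(".txt", StringComparison.OrdinalIgnoreCase));
    foreach (ZipEntry entry in textEntries)
    {
        entry.Extract(locatieUndeSeCopieazaString);
    }
}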
Another feature I liked is the ability to update a package, or to add/remove content from a package, without having to unzip it first.
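A minimal sketch of such an in-place update, assuming a hypothetical file to add and a hypothetical entry name to remove:

// requires: using Ionic.Zip;
using (ZipFile zip = ZipFile.Read(locatieZipString))
{
    zip.AddFile(@"C:\temp\changelog.txt");   // add a new file (hypothetical path)
    zip.RemoveEntry("old-readme.txt");       // remove an existing entry by name (hypothetical)
    zip.Save();                              // rewrite the archive it was read from
}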

If you ever need a solution for zipping/unzipping zip packages, I recommend this library.
The score it gets is 9 out of 10. It would have gotten 9.50 if it let me unzip a package to a specific location without forcing me to iterate over every entry in the package.
