4.07.2026

AI within DDD (Domain Driven Development)

In some projects I've participated in, and even created and built directly, I always thought about automating the generation of components, whether in Java, Python, or even backend JS. It's like creating cartridges and reusing them in construction. After all, computer programming borrows actions and concepts from engineering and architecture.

Today, many of us see vibe coding as something very new, but the concept isn't new. This new name describes the same thing we used years ago as development-acceleration and reuse tools. Where AI helps, it will still demand real effort in architecture and in medium- and high-complexity integrations. For simpler projects, we can use AI extensively in software engineering.

In programming, what we call and use as DDD – Domain Driven Development – has gained a new name, with prediction platforms alongside reusable development or implementation cartridges in various languages, built on reusable, registered, and typed logic, almost like a program within a program, you understand?

Vibe coding — prompt, paste, and hope — is great for quick prototypes. ONLY prototypes!

For larger projects, it quickly breaks down. 

The fix isn’t “more AI,” it’s better structure. Treat AI like a junior developer: give it context, guardrails, and a clear spec. That’s Spec-Driven Development — and it’s a game changer.

Vibe Coding Is Fun? ...Until It Isn’t

When I first discovered AI coding tools, I fell into what I call the “vibe coding” phase. Need a login page? Prompt the AI. Need styling? Prompt again. Need to fix an error? Another prompt.

At first, using Cursor felt like magic. One or two prompts and boom — working code. 

My entire job seemed to be clicking the “Accept All” button. I felt like I’d discovered some kind of coding cheat code.

For about two weeks, I moved fast, building features that somehow worked. This approach is actually perfect for prototypes and demos — you can go from idea to working concept in hours.

But as my project grew, the cracks appeared.

When the Magic Breaks Down

The AI started making expensive mistakes. It would rewrite entire files incorrectly, create functions that already existed, break existing functionality while adding new features. Each new feature came back with different state management, inconsistent styling, and duplicated APIs.

My app started to look like a group project where nobody spoke to each other.

Not only did this slow down development, but it was burning through money fast. All those API calls for code I’d have to throw away and regenerate.

That’s when I realized I had to start reading the code myself. I couldn’t just blindly click “Accept All” anymore. This forced me to learn TypeScript and CSS — which was terrifying at first, but also meant I could start pointing the AI to specific files: “Look at this existing function” or “Follow the pattern in this component.”

The problem wasn’t the AI — it was me.

The Great Shift to Specs

In my frustration, I started reading about real software development practices: requirements gathering, design docs, architecture planning. The “serious” stuff I had always thought was overkill for side projects.

Wait. Actually, as a seasoned backend engineer, I already knew this stuff. I’d written design docs for distributed systems, gathered requirements for API features, planned database schemas. The practices weren’t foreign to me — I just hadn’t thought to apply them to frontend development.

Then it hit me: what if I applied these methods, but directed them toward AI?

I began treating AI like a capable but inexperienced teammate. Someone who’s technically skilled but needs proper direction and context.

Building Structure Into AI Development

So I started experimenting. I wrote down what I wanted: the problem, the users, what success looked like, how the feature should fit into the existing system. Then I gave the AI the context it needed: existing APIs, data structures, navigation flow, and design patterns.

The results were night and day.

Instead of vague prompts like “build user authentication,” I’d write detailed specifications:

  • What the feature accomplishes and why
  • How users interact with it step by step
  • Technical requirements and integration points
  • Clear acceptance criteria for “done”

Then I’d include project context: how my state management worked, what API patterns I was using, how I organized components.
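For example, a spec for a hypothetical "forgot password" feature might look like this (every detail below is illustrative, not from a real project):

  Feature: Password reset
  Why: locked-out users currently have to contact support.
  Flow: user clicks "Forgot password" -> enters email -> receives a
        time-limited link -> sets a new password -> lands back on login.
  Technical: POST /auth/reset-request and POST /auth/reset-confirm,
        reusing the existing email service and auth token types.
  Done when: expired links are rejected, the happy path passes an
        end-to-end test, and no existing auth tests break.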

The Context Breakthrough

Maintaining context became the single most important thing. Instead of letting the AI improvise, I created a living document of how my app worked — existing APIs, page structure, common types, styling patterns. 

It was like an employee handbook for the AI.
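A trimmed, hypothetical illustration of what such a handbook might contain:

  APIs: REST under /api/v1, JSON everywhere, errors come back as { code, message }.
  State: one global store; pages read through selectors, never directly.
  Components: one folder per feature; shared pieces live in components/ui.
  Styling: utility classes only; design tokens in theme.ts, no inline styles.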

Keep the specification alive. Keep your project's build, your solution, alive and updated, applying the practice of real software architecture and implementation with AI working alongside it, not as an end in itself.

Many say nowadays that AI will replace programmers, but those who say this don't know what they're talking about, because the ones who can maintain continuous, high-quality automation are the programmers; it's us. Those who believe they can develop software without functional and non-functional specifications, simply by pressing a button, will be lost, I'm sorry to say.


Spec DDD = Specification Domain Driven Development



11.25.2025

Is VSAM access possible with JS? Yes!

Is it possible to use JS to access VSAM in a real IBM z/OS environment, with VSAM, CICS, and COBOL? Yes, we can!

A module exists that enables an application to read and modify VSAM datasets on z/OS: npm has a package named vsam.js.

Please note: before installing, download and install the IBM SDK for Node.js - z/OS.

Node.js 8.16.2 or higher is required.

=================================================

  // Minimal setup the examples below assume (the vsam.js module plus Node built-ins):
  const vsam = require('vsam.js');
  const fs = require('fs');
  const assert = require('assert');

  const vsamObj = vsam.openSync("sample.test.vsam.ksds",
                                JSON.parse(fs.readFileSync("schema.json")));

  // Find using a string as key:
  vsamObj.find("0321", (record, err) => {
    if (record !== null) {
      assert.equal(record.key, "0321");
      record.name = "JACOB";
      record.quantity = Buffer.from([0xe5, 0xf6, 0x78, 0x9a]).toString("hex");
      vsamObj.update(record, (err) => {
        if (err === null)   // a null error means the update succeeded
          console.log("update was successful");
        else
          console.error(err);
        vsamObj.close();
      });
    } else {
      console.error(err);
    }
  });

  // or find using binary data as key (type must be set to "hexadecimal"):
  const keybuf = Buffer.from([0xa1, 0xb2, 0xc3, 0xd4]);
  vsamObj.find(keybuf, keybuf.length, (record, err) => {
    ...

  // or find using a hexadecimal string as key (type must be set to "hexadecimal"):
  vsamObj.find("e5f6789a", (record, err) => {
    ...

  // Starting with vsam.js v3.0.0, find and update record(s) in one call synchronously:
  count = vsamObj.updateSync("f1f2f3", record);

  // or find and delete record(s) in one call:
  count = vsamObj.deleteSync("f1f2f3");

  // or find and update record(s) in one call asynchronously:
  vsamObj.update("f1f2f3", record, (count, err) => { ... });

schema.json
This file contains the dataset's field names and their attributes:
{
  "key": {
    "type": "hexadecimal",
    "maxLength": 8
  },
  "name": {
    "type": "string",
    "maxLength": 10,
    "minLength": 1
  },
  "quantity": {
    "type": "hexadecimal",
    "maxLength": 4
  }
}
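
For completeness, here is a minimal sketch of writing a new record that matches the schema above, assuming the package's write and close calls behave as in its samples (the field values below are made up):

  const record = {
    key: "a1b2c3d4",     // hexadecimal, up to 8 bytes
    name: "MARIA",       // string, 1 to 10 characters
    quantity: "00f1"     // hexadecimal, up to 4 bytes
  };
  vsamObj.write(record, (err) => {
    if (err === null)
      console.log("write was successful");
    else
      console.error(err);
    vsamObj.close();
  });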




10.27.2025

How to sell online: 8 steps to start selling online (we recommend)


1 – Identify your target audience

First and foremost, it's essential to identify your potential customers.
Knowing your target audience involves analyzing demographic characteristics such as age, gender, geographic location, income level, and other factors relevant to your market niche.
Also study this audience's purchasing behavior to optimize the shopping experience in your online store.
You need to know where your potential customers search for products, which communication channels they prefer, and what factors influence their purchasing decisions.


2 – Choose the right e-commerce platform

At this stage, it's important to choose an e-commerce platform that meets your business needs. 
Make sure it's easy to use, offers customization features, and reliable technical support.
If you're serious about learning how to sell online and truly entering this market, you need a truly personal space online.
This is the great advantage of having a website or online store, even if you already have social media profiles and other tools.
We recommend eCommerce Hamech, a comprehensive platform ready to help you from the beginning to the very top of your business.
With it, you'll have access to several essential features and integrations, such as shipping, payment, and marketing platforms.
In addition, eCommerce Hamech provides professional support through chat and live support rooms, where representatives are ready to answer questions and resolve user issues.

3 – Create an attractive visual identity

To create an attractive visual identity, it's important to consider elements that complement each other and convey the business's message and personality, fostering a connection between the public and the brand.

4 – Plan your offers

Consider how you will promote your offers. Create strong arguments and highlight customer testimonials and results.
All of this should create greater confidence in the end consumer, increasing your chances of a sale.
When a customer needs something, they're most likely willing to pay your price if your solution delivers what they need.

5 – Focus on Omnichannel

Applying an omnichannel strategy involves integrating a company's sales operations, whether it's inventory, sales, or data from a physical store, e-commerce, or any other sales channel used.
This results in a better customer experience, encouraging these people to return to your store in the future. See how omnichannel works in practice:
  • A customer buys a product on your website and picks it up at the physical store;
  • A customer adds a product to their e-commerce cart and completes the purchase through the app, without losing the saved product;
  • A customer visits the physical store, and their e-commerce information is already recorded, making the sales team's job easier;
  • A customer buys an item of clothing online, but it doesn't fit, so they exchange it at the physical store.
See how seamlessly customers can transition between your store's sales channels?

6 – Value After-Sale Service

It's important to maintain a post-sale relationship with your customers, as your brand's reputation depends on it.
According to Philip Kotler, author of Marketing Management, considered the bible of modern marketing, acquiring a new customer can cost up to seven times more than retaining an existing one.
Imagine a customer who buys something from you, and then you simply disappear. It's not a very positive experience.
If you do a good job, these people will likely buy from you again in the future. Furthermore, they may become advocates for your brand.

7 – Serve your audience online

Agile online customer service that solves problems without creating additional bureaucracy is crucial for building customer loyalty.

Therefore, when thinking about selling, organize your contact channels, such as:

  • Email
  • Phone
  • Social media
  • Website

Having a dedicated team to handle online sales and customer service can be a great way to make your business even more effective, avoiding future dissatisfaction.

8 – Invest in Content Marketing

Another important concept for those who want to learn how to sell online is Content Marketing. Content marketing, in and of itself, is any sales or advertising strategy based on the creation and distribution of content.

On the internet, corporate blogs are a good example, as are brand profiles that engage users on social media.

Content can be images, text, videos, and more. Furthermore, it's not interruptive like traditional ads.

By using Content Marketing, you attract people who are interested in your business's topic—in other words, potential customers. With each piece of content consumed, consumer trust in your brand increases.







10.15.2025

COBOL still has power... COBOL is alive!

Twenty years ago, I wrote a Python component that integrated with IBM MQ and consumed a response structure from IBM COBOL/DB2. It accessed DB2 data and a VSAM database to return a huge string to be rendered on a screen of an e-commerce website for a Brazilian airline. Twenty years ago… It was the beginning of my journey into high-end computing and what they now call the cloud, and I never imagined that, in 2025, this language created in the 1950s would still support the backbone of the digital economy. It still supports a lot!

COBOL is alive!

While Python, JavaScript, and Go dominate the headlines, COBOL continues to process trillions of dollars a day. Quietly, with the same reliability as ever.

In the 1950s, programming was chaotic. Each manufacturer had its own language, and changing machines required rewriting everything from scratch. That's when the US Department of Defense called on industry and academia to create a common, readable, and portable language. COBOL (Common Business Oriented Language) was born, with the brilliant Grace Hopper championing the idea that code should look like English, not math. Imagine???

The result? A language any business analyst could read and that quickly became the universal language of corporate mainframes.

From the 1960s to the 1980s, COBOL reigned supreme in banks, governments, and large corporations. At its peak, 80% of the world's business systems were written in it. Today, in the midst of the AI era, the numbers are still impressive, with estimates indicating that between 220 and 850 billion lines of COBOL remain active. US$3 trillion is processed every day using COBOL code, and 95% of ATMs and 80% of card transactions depend on it. In 2022, in the US alone, banks spent US$36.7 billion on legacy system maintenance, a figure expected to exceed US$57 billion by 2028!

What explains this longevity? Several factors: stability and performance, since COBOL systems are robust and virtually immune to failure; the cost of replacement, since rewriting millions of lines is risky and extremely expensive; and embedded knowledge, since much of the business rules live solely inside these systems.

Far from being a museum piece, COBOL has evolved. Recent versions support object orientation, JSON, XML, and REST APIs. Today, it runs on hybrid clouds, communicates with Java, Python, and C#, and can be accessed via microservices. The trend isn't to erase COBOL, but to encapsulate, refactor, and integrate it, preserving decades of business logic with new digital layers.
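
To make that integration concrete, here is a minimal sketch of a Node.js client calling a CICS/COBOL program exposed as a REST service (for example via IBM z/OS Connect); the host name, path, and JSON fields below are hypothetical:

  // Hypothetical REST call to a COBOL/CICS service fronted by z/OS Connect.
  const https = require('https');

  const body = JSON.stringify({ accountId: "0321" });
  const req = https.request({
    hostname: "zosconnect.example.com",   // made-up host
    path: "/cics/accounts/balance",       // made-up API path
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Content-Length": Buffer.byteLength(body)
    }
  }, (res) => {
    let data = "";
    res.on("data", (chunk) => { data += chunk; });
    // The COBOL program's output comes back as plain JSON:
    res.on("end", () => console.log(JSON.parse(data)));
  });
  req.on("error", console.error);
  req.end(body);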

COBOL isn't seen on tech conference stages, but it carries the show. And the fact that COBOL is still here, 65 years later, is perhaps the greatest testament to software engineering the world has ever seen, coexisting with JS, Python, and Java! To those new to the market: I hope you one day get to see this legacy operating as it does today.

Think about it: the MAINFRAME concept is today's CLOUD computing!










9.14.2025

Need to migrate your current ERP database to the cloud or a totally new ERP platform?

 


If you need to migrate your current ERP database to the cloud or a totally new ERP platform, all kinds of challenges can come into play and wreak havoc on the accuracy and completeness of your transferred data. Time, cost, data redundancy and integrity issues, stakeholder support and potential regulatory concerns are just some of these hurdles.

While Fostering a Continuous Reporting Environment for Users

How can you and your company ensure that data isn't being leaked to competitors?

Did your ERP pass the recovery environment test?

If you're splitting your ERP workflow into layers, be careful with a single ERP solution provider!

On the positive side, though, a new ERP implementation or move to the cloud provides an opportunity for you to sort out the data you really need to keep.  Leveraging a data hub can help you do so by storing and managing historical data like AP/AR, purchasing and sales history while the implementation is taking place.  The beauty of incorporating a data hub as part of a data migration strategy is that it can dramatically reduce the volume of data that you will ultimately need to load into your new ERP.  Plus, with your history in the data hub, business analysts and others can drive their analytics and reports in an uninterrupted manner while the new ERP implementation is underway and long afterwards.

But even if you don’t have a data hub now, many of the considerations you’ll need to make during your ERP migration are the same as the ones you would need to address when putting an enterprise data hub and reporting platform into place. 

This article covers all of these considerations and provides some proven best practices to help guide you in a successful migration of data from your current to new ERP system.

Why Is Data Migration So Important to a New ERP Implementation or Move to the Cloud?

The data migration process is critical to ensuring that the data in a new ERP system is accurate and complete.  This is vital because many people across the business will rely on that historical data.

A well-planned data migration strategy can help keep the entire ERP implementation project on time and on budget while addressing challenges like these:

Data Duplication

Multiple departments may store their own copies of information about the same customers or products in an ERP system, but their data may not be identical. For example, customer names and addresses may be stored in different formats or with varying addresses for the same customer. And across multiple ERP instances, different customer or product numbers may exist for the same customers and products. The list goes on! If you simply import every record from each business unit into the ERP database, you could end up with tons of duplications and inaccuracies. A migration strategy that includes a data hub for consolidating and storing this historical data can help overcome such redundancy and integrity obstacles.

Migration Costs

The cost of extracting, cleansing and restructuring data represents a significant percentage of the overall ERP migration budget. Such costs are a big reason why we are seeing more businesses invest in the use of corporate data hubs as part of their migration projects. This is particularly true for organizations that face very high migration costs due to the overwhelming volumes of data they need to manage.

Stakeholder Buy-in

As previously mentioned, various business units may use their own disparate systems to support their specific needs. That’s why it is necessary to involve management to ensure that everyone cooperates to produce a single, consistent set of data.

I can’t underscore enough how important it is to have proper executive buy-in during a data migration project – both at the beginning of it and throughout the process.  

Successful clients, for example, include regular executive involvement in some form or another such as regularly scheduled status meetings, budget reviews, or general exposure to the migration process by C- and V-level users across the organization.

 






Data Migration Best Practices

To guide the organization through the often-complex process of moving data to a new ERP platform or cloud-based deployment, it’s important to create an ERP data migration strategy that encompasses several key elements.

Create a migration team

To ensure a successful data migration, the process should start early to avoid delaying the ERP deployment. We recommend to our clients that they dedicate a team to analyzing the data, performing the migration and validating the results. The planning team can decide on what to include vs. exclude and the timing of what data should be migrated to the new ERP (and what should be stored in the data hub until the actual ERP implementation takes place if that’s the route they wish to take).

The data migration team is typically part of the overall ERP implementation team. Representatives from different business groups who can provide insights into how data is used by their respective business units or departments should be included on the team.

A good planning team like this can make a huge impact on a data migration project’s success. The team should include stakeholders – not just top/upper management.  

You also need individuals who understand the data, the processes in which the data is used, and the reports your company will need following the migration. We recommend to our clients that they create an inventory of reports used on a monthly, weekly, or daily basis to assist in this planning segment of their migration journeys. By doing so, the data used in those reports can be identified and included in the ERP migration plan much more easily.

Consider how you want to use your business data

Before starting the migration, spend time assessing your existing data, thinking about how it will be used within the ERP system, mapping it to the new ERP database, and setting up rules for translating the data to the new database structure during migration.

An ERP implementation is an opportunity to gain better insights into the business in real time by analyzing its data. So, when migrating, think about how data will be used for decision-making across the whole business, as well as by each department.  Many of Silvon’s customers generally want to attack their worst business pains first and the corresponding data sets are the ones we help store and manage for them within our data hub environment from the very beginning of their ERP migration projects.

There are many approaches for considering the broader use of business data outside of the main ERP database, too. 

How much history should be carried over?  

What should be migrated first?  

A data hub platform can offer up proven best practices and techniques for importing and managing history from a single ERP, multiple instances of an ERP, and different ERPs that may be deployed across an enterprise.

Assign data governance responsibilities

Determine who owns which data and assign roles to your team. For example, the team will have to determine which version of redundant customer information is correct and should be incorporated into the ERP system and corporate data hub if you use one. Now is also a good time to appoint someone with overall responsibility for compliance with any regulations that affect your business. Data governance also works well on the front end when building and managing analytics and reports based on your ERP data.

Less is more: be selective with the data you migrate

You may be tempted to import every piece of data from your old ERP into your new system, but not all historical data is useful or needs to be immediately accessible. In fact, importing every historical data item can slow system performance and make it harder for users to find the information they need.

Having a smaller quantity of trusted data makes migration faster and less risky. This approach also allows you to deliver critical information on a very tight schedule before go-live. As a fallback, you may elect to store historical data that won’t be migrated in a separate system like a data hub should you need it for analysis or other uses later.

Strongly consider a data hub to streamline your migration efforts

As mentioned earlier, taking advantage of a new or existing data hub during the migration to a new ERP can streamline your migration effort while ensuring that your reporting initiatives continue smoothly as the new implementation takes place.

Keep in mind that while the volume of data you’ll need to load into your new ERP will be reduced using a data hub, you will still need to convert old codes to new codes in the new ERP – such as customer, product, product categories, etc. based on the ERP’s requirements for the data types and format of the codes.  Fortunately, you can use the Master Data in your data hub to maintain your old codes and to “seed” Master Data in your new ERP.
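
As a small illustration of that seeding step, here is a hedged sketch of translating legacy codes during the load, assuming a hypothetical mapping exported from the data hub's Master Data (all names and values below are made up):

  // Hypothetical legacy-to-new code map seeded from the data hub's Master Data:
  const customerCodeMap = {
    "CUST-0321": "100000321",   // legacy customer code -> new ERP code
    "CUST-0322": "100000322"
  };

  // Translate a legacy transaction before loading it into the new ERP:
  function translateTransaction(tx) {
    const newCode = customerCodeMap[tx.customerCode];
    if (newCode === undefined)
      throw new Error("No mapping for legacy customer " + tx.customerCode);
    return { ...tx, customerCode: newCode };
  }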

If possible, consider acquiring and bringing a data hub solution on-line BEFORE you begin your ERP implementation.  You’ll be in a much better position to tackle your data migration with far fewer headaches by doing so!

Consolidating your data may be beneficial

To understand what transformation is required to prepare legacy data for upload, you should have a clear vision of the results you’re expecting. 

Which data is needed for critical reports? How much detail must be there?

Will a summary be good enough?

ERP systems allow you to bring consolidated information through journals that group legacy transactions based on certain criteria. You may want to consider this approach if you wish to minimize the amount of information you’ll need to upload to the new system to support your operations and reporting requirements.

Silvon has worked with several clients that consolidate their data. For these businesses, we typically globalize customer and product numbers as needed while also making the numbers unique to avoid any data summarization issues.  This ensures that the ERP data they bring into our data hub for planning, analysis and reporting provides the amount of detail they need.

Be sure to reconcile the data

How can you check that all information is correct when you migrate your transactional data to the new ERP?

After all the transformations, you need to make sure that results can be trusted. For that reason, we recommend that a reconciliation stage be included in the migration plan.

Business users should be provided with reports of summarized data for their review. While reports may not match exactly because of new account structures, etc. in the new ERP system, the totals should match. The data migration team can then map your new accounts back to legacy ones to verify line details.

If you consolidate your data, you’ll find it easier to review and fix issues on the line level because once a successful reconciliation of one journal has been completed, most issues of data conversion can be discovered and the next journals will take a fraction of the time to migrate and verify.
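
A hedged sketch of that reconciliation check in code: sum the legacy and migrated transactions per account and flag any totals that disagree (the account and field names are hypothetical):

  // Totals should match per account even if line-level structures changed.
  function totalsByAccount(transactions) {
    const totals = {};
    for (const tx of transactions)
      totals[tx.account] = (totals[tx.account] || 0) + tx.amount;
    return totals;
  }

  function reconcile(legacyTxs, migratedTxs) {
    const legacy = totalsByAccount(legacyTxs);
    const migrated = totalsByAccount(migratedTxs);
    for (const account of Object.keys(legacy))
      if ((migrated[account] || 0) !== legacy[account])
        console.error("Mismatch on " + account + ": legacy " +
                      legacy[account] + ", new " + (migrated[account] || 0));
  }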

If you decide prior to your new ERP implementation to employ a data hub to store your historical data during the migration, data balancing (or reconciliation) will play an important and integral part of the hub’s implementation, too.  Our clients typically find that the reconciliation process is much more successful when there’s involvement from migration team analysts as well.

Test early and often

Testing early and many times over the course of your ERP migration project can help to ensure its success.

  • Start testing your new system with small amounts of your migrated data as early as possible and gradually build up to more comprehensive testing over time. This strategy also works well if you decide to create a data hub for your analysts and other business users as part of your ERP migration strategy.
  • Start with representative subsets of customers and orders, and then gradually expand to cover all data, applications and uses.
  • Go through your checklist and run tests in which users go through their entire day-to-day processes on the new system.  This can expose problems that might otherwise be missed.
  • If you use a data hub to store your ERP reporting data, setting up a nightly load process will help you expose and resolve issues as well.


6.05.2025

Dangerous Narrative AI

The dominant discourse around AIG, particularly LLMs, is tainted by a dangerous narrative: the misattribution of human capabilities such as “reasoning,” “thinking,” and “interpretability” to systems that, in essence, operate in radically different ways. The article below, in a lucid analysis based on strong evidence, demonstrates that this tendency is misguided.


A closer understanding of the inner workings of these models reveals that the so-called “intermediate tokens,” the steps in information processing, are not manifestations of thought or reasoning. They are better explained by complex mathematical and statistical structures, visualized in elegant graphs, but fundamentally non-anthropomorphic. What is observed is, in reality, a process of prompt augmentation, highly dependent on formal checkers. These checkers are external mechanisms that guide and validate the process within specific and limited settings, not an intrinsic capability of the model.


The term “interpretability” has acquired a worrisome status. When applied to the analysis of these intermediate tokens without a robust causal basis and real verifiability, it is nothing more than an attempt to extract meaning from what is merely a complex statistical correlation. Just as the protrusions on a person’s skull do not reveal a person’s future or health status, the “interpretation” of an AI model’s internal signals lacks scientific foundation.


The anthropomorphization of terms like reasoning and interpretability is, in fact, a misappropriation of language to lend illusory credibility, attract investment, or mask technical limitations. And when adopted by consumers and enthusiasts, this same anthropomorphization often reflects a lack of access to technical criticism. It is an understandable mistake, but no less damaging for perpetuating myths.


The tendency to anthropomorphize sophisticated statistical systems is not benign. It obscures the true nature of the technology, opens the door to excessive hype, and diverts resources and attention from AI research beyond AIG. Recognizing that prompt augmentation guided by formal verifiers is at the core of current operation, and not a simulacrum of thought, is essential for sound and ethical AI development. The future of AI depends on the ability to distinguish between mathematical elegance and the illusion of consciousness.


* AIG - Generative Artificial Intelligence





