<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Daniel Lopes]]></title><description><![CDATA[Sharing my progress and learnings on startups, product development, AI, LLMs, web engineering, Ruby, Rails, management, and more.]]></description><link>https://journal.daniellopes.dev</link><image><url>https://substackcdn.com/image/fetch/$s_!Rba4!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cb27d6-59e6-4020-98e1-55ced89ce823_700x700.png</url><title>Daniel Lopes</title><link>https://journal.daniellopes.dev</link></image><generator>Substack</generator><lastBuildDate>Mon, 06 Apr 2026 14:21:58 GMT</lastBuildDate><atom:link href="https://journal.daniellopes.dev/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Daniel Lopes]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[daniellopes@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[daniellopes@substack.com]]></itunes:email><itunes:name><![CDATA[Daniel Lopes]]></itunes:name></itunes:owner><itunes:author><![CDATA[Daniel Lopes]]></itunes:author><googleplay:owner><![CDATA[daniellopes@substack.com]]></googleplay:owner><googleplay:email><![CDATA[daniellopes@substack.com]]></googleplay:email><googleplay:author><![CDATA[Daniel Lopes]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[EDD: Evals Driven Development]]></title><description><![CDATA[The ideal LLM-dev environment that I wish was easier to do&#8230;]]></description><link>https://journal.daniellopes.dev/p/edd-evals-driven-development</link><guid 
isPermaLink="false">https://journal.daniellopes.dev/p/edd-evals-driven-development</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Sun, 10 Aug 2025 19:27:54 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/af2aaeee-069e-430f-b5cb-e1c1a02ac339_1456x832.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><strong>The Problem:</strong> Don't try to retrofit LLM testing into old paradigms. Testing pyramids, CI/CD gates, deterministic assertions - these were built for a world where <code>f(x) = y</code> every time. LLMs broke that contract.</p><h2>Core</h2><p>Traditional software has bugs. LLM software has <em>behaviors</em>.</p><p>You don't test behaviors - you observe them, measure them, and guide them. This requires a fundamentally different approach.</p><h2>Three Modes of Observation</h2><p><strong>Mode 1: Introspection</strong> The model evaluates itself during generation. Like a writer reviewing their draft before hitting send. Built-in confidence scoring, explanation generation, self-critique loops. This happens at inference time, costs tokens, but provides immediate guardrails.</p><p><strong>Mode 2: Instrumentation</strong> Automated measurement of what actually happened. Response times, token usage, embedding similarities, retrieval success rates. Like application performance monitoring but for AI quality. Runs continuously, at scale.</p><p><strong>Mode 3: Interpretation</strong> Humans make sense of what matters. Not everything can be measured - sometimes you need human judgment on tone, helpfulness, or edge cases. 
Expensive but irreplaceable for subjective quality and ground truth.</p><h2>What This Looks Like in Practice</h2><p><strong>During Development:</strong></p><ul><li><p>Start with exploration, not test cases - explore what your model can do</p></li><li><p>Capture interesting behaviors as you discover them (the good, bad, and weird)</p></li><li><p>Write expectations, not assertions: "should be empathetic" not "must contain 'sorry'"</p></li><li><p>Build your eval suite backwards - get it working first, then define what "good" means based on real outputs</p></li></ul><p><strong>During Deployment:</strong></p><ul><li><p>Shadow scoring on real traffic before switching</p></li><li><p>Gradual rollout based on confidence thresholds</p></li><li><p>Automatic fallback to previous version if scores drop</p></li><li><p>No binary deploy/rollback - it's a confidence dial</p></li></ul><p><strong>During Operation:</strong></p><ul><li><p>Stream of observations, not error logs</p></li><li><p>Distribution of scores, not pass/fail counts</p></li><li><p>Anomaly detection, not threshold alerts</p></li><li><p>Behavioral drift tracking, not uptime monitoring</p></li></ul><p><strong>During Improvement:</strong></p><ul><li><p>Human labels on strategic samples, not random QA</p></li><li><p>Disagreement cases between evaluators get priority</p></li><li><p>Corrections become test cases automatically</p></li><li><p>Fine-tuning happens on production-validated examples</p></li></ul><h2>Embracing LLM Constraints</h2><p><strong>Embrace Uncertainty:</strong> Stop pretending LLMs are deterministic. Design for confidence intervals, not binary outcomes.</p><p><strong>Compose, Don't Compile:</strong> Evaluators should be Lego blocks - mix and match for your use case. Need safety + factuality + tone? Stack them.</p><p><strong>Optimize for Learning:</strong> Every evaluation should make the system smarter. 
If it doesn't feed back into improvement, why measure it?</p><p><strong>Human Time is Sacred:</strong> Only escalate to humans when machines disagree or confidence is low. Make their input count by turning it into reusable signals.</p>]]></content:encoded></item><item><title><![CDATA[Prompt Engineering Techniques]]></title><description><![CDATA[A comprehensive guide to prompt engineering techniques.]]></description><link>https://journal.daniellopes.dev/p/prompt-engineering-techniques</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/prompt-engineering-techniques</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Sun, 22 Dec 2024 01:23:33 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/39f525dd-2098-4f69-bfde-39a74bad7386_1456x832.png" length="0" type="image/png"/><content:encoded><![CDATA[<p>A few months ago, I shared a summary of papers I was reading at the time&#8212;these were simply my personal notes. This week, while onboarding a new engineer who joined us at <a href="https://growthx.ai/">GrowthX</a>, I decided to create a new summary that is easier to digest and more practical.</p><p>(BTW, <a href="https://growthx.ai/careers">we are hiring!</a>)</p><p>I hope you find this helpful:</p><h2>Zero-shot Prompting</h2><p>Zero-shot prompting allows you to leverage an LLM's capabilities by giving it task instructions without any examples or training data.</p><p>The power of zero-shot prompting lies in its simplicity&#8212;you communicate directly with the model using natural language instructions. For example, instead of showing multiple examples of sentiment analysis, you can simply ask:</p><pre><code><code>Text: i'll bet the video game is a lot more fun than the film.
Sentiment:
</code></code></pre><p>Zero-shot prompting works because LLMs have been trained on vast amounts of data and can understand task contexts from clear instructions alone.</p><h3>When to Use Zero-shot Prompting</h3><p>You should consider zero-shot prompting when:</p><ul><li><p>You need quick results for general tasks</p></li><li><p>The task is relatively straightforward and commonly understood</p></li><li><p>You don't have example data readily available</p></li><li><p>You want to test the model's base capabilities</p></li><li><p>The task doesn't require complex reasoning or multi-step processes</p></li></ul><h3>Benefits and Limitations</h3><p>Benefits:</p><ul><li><p>Simplicity in implementation</p></li><li><p>No need to curate examples</p></li><li><p>Faster execution without example processing</p></li><li><p>Tests the model's true understanding of tasks</p></li><li><p>Useful for rapid prototyping</p></li></ul><p>Limitations:</p><ul><li><p>May be less accurate for complex or specialized tasks</p></li><li><p>Performance can vary significantly between different models</p></li><li><p>Less control over the exact format of outputs</p></li><li><p>May struggle with nuanced or domain-specific tasks</p></li></ul><p>For more complex tasks that require reasoning or specialized knowledge, you might need to consider other techniques like few-shot prompting or chain-of-thought prompting. 
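</p><p>As a concrete sketch, here is how a zero-shot call might be wired up in Python. <code>call_llm</code> is a stand-in for whatever client you actually use (OpenAI, Anthropic, a local model) and is stubbed here so the example runs on its own:</p>

```python
# Zero-shot: the instruction alone defines the task -- no examples are provided.
def build_zero_shot_prompt(task: str, text: str) -> str:
    return f"{task}\n\nText: {text}\nSentiment:"

# Stand-in for a real LLM client call; stubbed so the sketch is self-contained.
def call_llm(prompt: str) -> str:
    return "negative"

prompt = build_zero_shot_prompt(
    "Classify the sentiment of the text as positive, negative, or neutral.",
    "i'll bet the video game is a lot more fun than the film.",
)
answer = call_llm(prompt)
```

<p>Swapping the stub for a real client call is the only change needed to run this against an actual model.</p><p>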
However, zero-shot prompting serves as an excellent starting point for many applications and can often surprise you with its effectiveness.</p><p>To learn more about zero-shot prompting and its applications, you can explore the detailed research and documentation in <a href="https://lilianweng.github.io/posts/2023-03-15-prompt-engineering/">Lil'Log's Prompt Engineering Guide</a> or dive deeper into implementation strategies at <a href="http://promptingguide.ai/">promptingguide.ai</a>.</p><h2>Few-shot Prompting</h2><p>Few-shot prompting involves providing the language model with examples of the desired task, helping it understand your intentions before handling new cases.</p><h3>Implementation and Best Practices</h3><p>When implementing few-shot prompting, you'll want to structure your prompt with clear example pairs. Here's a basic format:</p><pre><code><code>Input: [First example input]
Output: [First example output]

Input: [Second example input]
Output: [Second example output]

Input: [Your actual input]
Output:
</code></code></pre><p>The effectiveness of few-shot prompting heavily depends on how you select and present your examples. Research has shown that choosing semantically similar examples to your target task can significantly improve performance. You can use embedding-based techniques like k-NN clustering to find relevant examples from your dataset.</p><h3>Example Selection Strategy</h3><p>When selecting examples, consider these key factors:</p><ul><li><p>Choose diverse examples that cover different aspects of the task</p></li><li><p>Ensure examples are representative of the desired output format</p></li><li><p>Keep the number of examples balanced with your token limit</p></li><li><p>Maintain consistent formatting across all examples</p></li></ul><p>Remember that the order of your examples matters. Studies have identified several biases that can affect model performance, including majority label bias (where the model favors more frequently seen labels) and recency bias (where it tends to favor the most recent examples).</p><h3>Benefits and Trade-offs</h3><p>Few-shot prompting offers several advantages over zero-shot approaches:</p><ul><li><p>Provides clearer context for complex tasks</p></li><li><p>Helps establish consistent output formatting</p></li><li><p>Reduces ambiguity in task interpretation</p></li><li><p>Improves performance on specialized or domain-specific tasks</p></li></ul><p>However, these benefits come with trade-offs. Each example consumes tokens from your context window, which can be particularly challenging when working with longer inputs or outputs. You'll need to balance the number of examples against your model's context length limits and cost considerations.</p><p>For optimal results, consider implementing dynamic example selection based on your input. Dynamic example selection, while more complex, can help maximize the relevance of your examples while minimizing token usage. 
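</p><p>The k-NN idea can be sketched in a few lines of Python. The letter-count <code>embed</code> function below is a deliberately toy stand-in for a real embedding model (e.g. a sentence-transformer); the selection and prompt-building logic is the part that carries over:</p>

```python
import math

# Toy embedding: letter counts. A real system would call an embedding model.
def embed(text: str) -> list[float]:
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Pick the k (input, output) pairs most similar to the query -- the k-NN step.
def select_examples(query: str, pool: list[tuple[str, str]], k: int = 2) -> list[tuple[str, str]]:
    q = embed(query)
    return sorted(pool, key=lambda ex: cosine(q, embed(ex[0])), reverse=True)[:k]

def build_few_shot_prompt(query: str, pool: list[tuple[str, str]], k: int = 2) -> str:
    parts = [f"Input: {i}\nOutput: {o}" for i, o in select_examples(query, pool, k)]
    parts.append(f"Input: {query}\nOutput:")
    return "\n\n".join(parts)

pool = [
    ("the movie was dull and overlong", "negative"),
    ("a fun, breezy game from start to finish", "positive"),
    ("the plot dragged but the visuals shined", "mixed"),
]
prompt = build_few_shot_prompt("i'll bet the video game is a lot more fun than the film.", pool)
```

<p>In production you would precompute and cache the pool embeddings rather than re-embedding on every query.</p><p>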
Some researchers have even explored using Q-Learning and active learning techniques to optimize example selection for specific use cases.</p><h2>Chain-of-Thought Prompting</h2><p>Chain-of-Thought (CoT) prompting enhances the reasoning capabilities of large language models by encouraging them to break down complex problems into logical steps.</p><h3>When to Use Chain-of-Thought</h3><p>CoT prompting is particularly effective for tasks requiring multi-step reasoning, such as:</p><ul><li><p>Mathematical word problems</p></li><li><p>Complex logical deductions</p></li><li><p>Symbolic reasoning tasks</p></li><li><p>Decision-making scenarios requiring multiple considerations</p></li></ul><p>The technique works best with larger language models (&gt;100B parameters), as they can better leverage the structured reasoning approach. For simpler tasks or smaller models, traditional prompting might be more appropriate.</p><h3>Implementation and Examples</h3><p>Here's how to implement CoT prompting effectively:</p><pre><code><code>Q: John has 10 apples. He gives away 4 and then receives 5 more. How many apples does he have?

A: Let's solve this step by step:
1. John starts with 10 apples
2. He gives away 4, so 10 - 4 = 6 apples
3. He receives 5 more, so 6 + 5 = 11 apples
Therefore, John has 11 apples.

Q: [Your complex question]
A: Let's solve this step by step:
[Your reasoning steps]
</code></code></pre><p>Research from Wei et al. shows that formatting matters: using newline breaks between reasoning steps performs better than using periods or semicolons. Additionally, using "Question:" instead of "Q:" can improve performance.</p><h3>Best Practices and Limitations</h3><p>To maximize CoT effectiveness:</p><ul><li><p>Break down complex problems into clear, sequential steps</p></li><li><p>Use explicit reasoning markers ("First," "Then," "Therefore")</p></li><li><p>Include diverse examples in your prompts for better generalization</p></li><li><p>Consider using self-consistency sampling to improve accuracy by generating multiple reasoning paths</p></li></ul><p>However, be aware that CoT has limitations:</p><ul><li><p>Less effective with smaller models, which may produce incoherent reasoning</p></li><li><p>Can sometimes generate plausible-sounding but incorrect reasoning chains</p></li><li><p>May not improve performance on simple tasks that don't require multi-step thinking</p></li></ul><p>For complex reasoning tasks, you can enhance CoT further by combining it with other techniques like self-consistency sampling, which generates multiple reasoning paths and selects the most consistent answer through majority voting.</p><h2>Meta Prompting</h2><p>Meta prompting involves guiding the LLM to generate and optimize prompts itself by providing higher-level instructions.</p><h3>How Meta Prompting Works</h3><p>When using meta prompting, you essentially create a prompt about prompts. For example:</p><p>Generate a prompt that would help extract key financial metrics from quarterly reports. 
The prompt should:</p><ul><li><p>Focus on specific numerical indicators</p></li><li><p>Include validation steps</p></li><li><p>Request structured output in JSON format</p></li></ul><p>The LLM will then generate a task-specific prompt that meets these requirements, which you can use for your actual data processing.</p><h3>Key Applications</h3><p>Meta prompting is particularly valuable in scenarios where:</p><ul><li><p>You need to handle varying input formats or contexts</p></li><li><p>The task requirements might change over time</p></li><li><p>You want to automatically optimize prompts based on results</p></li><li><p>You're building systems that need to adapt to different user needs</p></li></ul><p>For instance, in a document analysis pipeline, you might use meta prompting to generate specialized extraction prompts based on the document type:</p><pre><code><code>Input: Create a prompt for analyzing {document_type}
Context: The system needs to identify {key_elements}
Requirements:

* Maintain consistent output structure
* Include error handling
* Focus on {specific_metrics}
</code></code></pre><h3>Dynamic Prompt Generation</h3><p>One of the most powerful applications is dynamic prompt generation. Instead of using static prompts, you can create meta-level instructions that help the model adjust its approach based on the input:</p><pre><code><code>Based on the user's technical expertise level ({expertise}), generate a prompt that will:

1. Explain {concept} at the appropriate depth
2. Use relevant examples for their field
3. Include follow-up questions tailored to their background
</code></code></pre><h3>Benefits and Considerations</h3><p>Meta prompting offers several advantages:</p><ul><li><p>Increased flexibility in handling diverse use cases</p></li><li><p>Better adaptation to changing requirements</p></li><li><p>More maintainable prompt management</p></li><li><p>Improved scalability for complex applications</p></li></ul><p>However, it's important to note that meta prompting adds a layer of complexity and may require more computational resources since you're essentially running two prompt cycles&#8212;one to generate the prompt and another to execute it.</p><p>When implementing meta prompting, focus on clear constraints and validation criteria to ensure the generated prompts align with your objectives and maintain consistent quality in the final outputs.</p><h2>Self-Consistency</h2><p>Self-consistency enhances the reliability of language models by generating multiple reasoning paths for the same problem and selecting the most consistent answer through majority voting.</p><h3>When to Use Self-Consistency</h3><p>You should consider implementing self-consistency when your tasks involve multi-step reasoning or when there are multiple valid approaches to reach a solution. 
According to research from Mercity AI, this technique has shown remarkable improvements across various benchmarks:</p><ul><li><p>17.9% improvement on GSM8K (mathematical reasoning)</p></li><li><p>11.0% improvement on SVAMP (word problems)</p></li><li><p>12.2% improvement on AQuA (analytical reasoning)</p></li></ul><p>The benefits become even more pronounced with larger language models, with improvements of up to 23% observed in models like LaMDA137B and GPT-3.</p><h3>Implementation Approach</h3><p>To implement self-consistency effectively:</p><ol><li><p>Generate multiple solutions for the same problem using different reasoning paths.</p></li><li><p>Introduce randomness through various methods:</p><ul><li><p>Altering the order of examples</p></li><li><p>Using model-generated rationales instead of human-written ones</p></li><li><p>Varying the complexity of reasoning chains</p></li></ul></li><li><p>Aggregate the results through majority voting.</p></li></ol><p>When working with training examples, you can follow the STaR (Self-Taught Reasoner) method as outlined by researchers like Lilian Weng:</p><ul><li><p>Generate reasoning chains and retain those leading to correct answers.</p></li><li><p>Fine-tune the model with these generated rationales.</p></li><li><p>Iterate until convergence.</p></li></ul><h3>Performance Considerations</h3><p>Self-consistency is particularly effective because it's an unsupervised technique that requires no additional human annotation, training, or model fine-tuning. It remains robust across different sampling strategies and parameters, consistently enhancing performance.</p><p>For optimal results, consider implementing complexity-based consistency, where you explicitly prefer complex chains among all generations. This approach involves taking majority votes among only the top k complex chains, which has shown to be particularly effective in improving reasoning accuracy.</p><p>The technique becomes increasingly valuable as model size grows. 
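</p><p>The sample-then-vote loop described above can be sketched as follows. <code>sample_chain</code> is a stand-in for a temperature &gt; 0 model call and returns canned reasoning chains here (one deliberately inconsistent) so the example is self-contained:</p>

```python
from collections import Counter

# Stand-in for a sampled (temperature > 0) LLM call; canned chains for the demo.
def sample_chain(question: str, seed: int) -> str:
    canned = [
        "10 - 4 = 6, then 6 + 5 = 11. Answer: 11",
        "10 + 5 = 15, then 15 - 4 = 11. Answer: 11",
        "10 - 4 + 5 = 11. Answer: 11",
        "10 - 5 + 4 = 9. Answer: 9",  # an inconsistent reasoning path
    ]
    return canned[seed % len(canned)]

def extract_answer(chain: str) -> str:
    return chain.rsplit("Answer:", 1)[-1].strip()

# Sample several independent reasoning paths, then majority-vote the answers.
def self_consistent_answer(question: str, n_samples: int = 4) -> str:
    answers = [extract_answer(sample_chain(question, s)) for s in range(n_samples)]
    return Counter(answers).most_common(1)[0][0]

final = self_consistent_answer("John has 10 apples, gives away 4, receives 5. How many?")
```

<p>The voting step is what absorbs the occasional bad chain: a single wrong path is outvoted as long as most samples converge.</p><p>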
Even for large models that already perform well, self-consistency consistently offers additional gains, with improvements of 12%-18% in accuracy on tasks like AQuA and GSM8K, even when using advanced models like PaLM-540B.</p><h2>Generate Knowledge Prompting</h2><p>Generate Knowledge Prompting is a powerful technique that helps LLMs perform better on tasks requiring deep contextual understanding by first generating relevant knowledge about a topic before attempting to answer questions or complete tasks.</p><h3>How It Works</h3><p>The technique follows a two-step process:</p><ol><li><p>First, prompt the model to generate knowledge about the specific topic.</p></li><li><p>Then, use that generated knowledge to inform the final response.</p></li></ol><p>Here's a basic template for knowledge generation:</p><pre><code><code>Generate some knowledge about the input.

Examples:
Input: What type of water formation is formed by clouds?
Knowledge: Clouds are made of water vapor.

Input: {your_question}
Knowledge:
</code></code></pre><p>This initial knowledge generation step acts as an intermediate reasoning layer, allowing the model to explicitly state relevant facts and context before tackling the main task.</p><h3>Implementation Steps</h3><ol><li><p><strong>Knowledge Generation</strong>: Start by prompting the model to generate relevant knowledge about your topic. This creates an explicit knowledge base for the model to work with.</p></li><li><p><strong>Knowledge Integration</strong>: Feed the generated knowledge back into the prompt along with your main task or question. This ensures the model has immediate access to relevant context.</p></li><li><p><strong>Final Response</strong>: The model then uses both the generated knowledge and the original question to produce a more informed and accurate response.</p></li></ol><h3>Benefits and Applications</h3><p>Generate Knowledge Prompting is particularly effective for:</p><ul><li><p>Complex questions requiring domain expertise</p></li><li><p>Tasks involving temporal knowledge (historical or current events)</p></li><li><p>Scenarios where implicit knowledge needs to be made explicit</p></li><li><p>Cases where the model needs to demonstrate reasoning about specific facts</p></li></ul><p>According to research by Liu et al., this technique has shown significant improvements in response quality even with just "internal retrieval"&#8212;generating knowledge without external sources. The method helps bridge the gap between the model's training data and the specific context needed for accurate responses.</p><p>This approach can be particularly powerful when combined with other techniques like Retrieval Augmented Generation (RAG) for external knowledge sources or Chain-of-Thought prompting for complex reasoning tasks. 
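</p><p>The two-step flow can be sketched like this; both model responses are stubbed (and purely illustrative) so the example runs without an API key:</p>

```python
# Stubbed two-step model: the first call generates knowledge, the second answers.
def call_llm(prompt: str) -> str:
    if prompt.startswith("Generate some knowledge"):
        return "Clouds are made of water vapor."
    return "Fog."

def answer_with_knowledge(question: str) -> str:
    # Step 1: ask the model to state relevant facts before answering.
    knowledge = call_llm(
        f"Generate some knowledge about the input.\nInput: {question}\nKnowledge:"
    )
    # Step 2: feed the generated knowledge back in alongside the question.
    return call_llm(f"Knowledge: {knowledge}\nQuestion: {question}\nAnswer:")

answer = answer_with_knowledge("What type of water formation is formed by clouds?")
```

<p>Because the intermediate knowledge string is captured explicitly, it can be logged and inspected when debugging a wrong final answer.</p><p>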
For instance, you can generate knowledge about multiple aspects of a problem, then use that knowledge to construct a step-by-step solution.</p><p>The technique is especially valuable in production environments where accuracy and reliability are crucial, as it provides an explicit trail of the knowledge being used to form responses, making it easier to verify and debug the model's reasoning process.</p><h2>Prompt Chaining</h2><p>Prompt chaining involves creating a sequence of prompts, where each prompt's output serves as input for the next, effectively breaking down complex tasks into manageable subtasks.</p><h3>When to Use Prompt Chaining</h3><p>You should consider implementing prompt chaining when your task:</p><ul><li><p>Requires multiple logical steps to complete</p></li><li><p>Can be naturally broken down into smaller subtasks</p></li><li><p>Needs intermediate validation or processing</p></li><li><p>Would benefit from focused, specialized prompts rather than one large prompt</p></li></ul><p>For example, instead of asking an LLM to analyze a long document and provide recommendations in a single prompt, you might chain prompts to first summarize the document, then identify key themes, and finally generate specific recommendations based on those themes.</p><h3>Implementation Approach</h3><p>To implement prompt chaining effectively:</p><ol><li><p>Break down your complex task into discrete steps</p></li><li><p>Design specific prompts for each step</p></li><li><p>Create a pipeline where outputs flow as inputs</p></li><li><p>Include validation checks between steps</p></li><li><p>Handle errors and edge cases at each stage</p></li></ol><p>The key is to make each prompt in the chain focused and specific, rather than trying to accomplish everything at once. 
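</p><p>A minimal three-step chain (summarize, extract themes, recommend) might look like the sketch below. The stubbed <code>call_llm</code> simply echoes the instruction it received, which makes the output-to-input flow between steps visible:</p>

```python
# Stubbed model call: echoes the first line of its prompt so the chain's
# data flow is visible without a real API.
def call_llm(prompt: str) -> str:
    return f"[model output for: {prompt.splitlines()[0]}]"

def run_chain(document: str) -> str:
    # Step 1: summarize the raw document.
    summary = call_llm(f"Summarize this document:\n{document}")
    # Step 2: the summary becomes the input of the next prompt.
    themes = call_llm(f"List the key themes in this summary:\n{summary}")
    # Step 3: recommendations are generated from the themes alone.
    return call_llm(f"Given these themes, suggest recommendations:\n{themes}")

result = run_chain("Q3 revenue grew 12% while support tickets doubled.")
```

<p>Each intermediate result is a natural place to add a validation check before the next call runs.</p><p>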
This improves reliability and makes the system easier to debug and maintain.</p><h3>Benefits of Prompt Chaining</h3><p>Prompt chaining offers several advantages in production environments:</p><ul><li><p>Improved reliability: By breaking down complex tasks, each step becomes more manageable and reliable</p></li><li><p>Better control: You can monitor and validate intermediate results</p></li><li><p>Enhanced debugging: When issues occur, you can identify exactly which step in the chain failed</p></li><li><p>Flexible architecture: Chains can be modified or extended without rebuilding the entire system</p></li><li><p>Reusable components: Individual prompts in the chain can be reused across different workflows</p></li></ul><h3>Real-World Applications</h3><p>In production systems, prompt chaining is particularly valuable for tasks like:</p><ul><li><p>Content generation: Breaking down the process into research, outlining, writing, and editing steps</p></li><li><p>Data analysis: Sequencing data cleaning, analysis, and insight generation</p></li><li><p>Customer service: Routing queries through understanding, context gathering, and response generation</p></li><li><p>Document processing: Implementing staged approaches for extraction, analysis, and summarization</p></li></ul><p>By implementing prompt chaining, you create more robust and maintainable LLM applications that can handle complex tasks with greater reliability and control. The key is to thoughtfully design your chains to match your specific use case requirements while maintaining clear boundaries between each step in the process.</p><h2>Tree of Thoughts</h2><p>Tree of Thoughts (ToT) enables language models to explore multiple potential solutions simultaneously through a branching tree structure, extending beyond the linear reasoning of Chain-of-Thought.</p><h3>How ToT Works</h3><p>ToT breaks down problems into coherent units of text called "thoughts" that serve as intermediate steps. 
At each step, the model generates multiple possible thoughts, creating branches in a tree-like structure. These branches are then explored using either breadth-first search (BFS) or depth-first search (DFS), with each state being evaluated through classifier prompts or majority voting.</p><p>The key innovation is that ToT allows language models to:</p><ul><li><p>Make deliberate decisions between multiple options</p></li><li><p>Explore different reasoning paths simultaneously</p></li><li><p>Self-evaluate choices at each step</p></li><li><p>Look ahead or backtrack when needed</p></li><li><p>Make globally optimal decisions</p></li></ul><h3>Performance Benefits</h3><p>The effectiveness of ToT is particularly evident in complex problem-solving tasks. For instance, in the Game of 24, ToT achieved a 74% success rate when considering five possible solutions at each step (<em>b=5</em>), compared to just 7.3% for standard input-output approaches. In broader testing across 100 diverse tasks, ToT consistently outperformed traditional methods, achieving an <a href="https://www.mercity.ai/blog-post/advanced-prompt-engineering-techniques">average GPT-4 score of 7.56</a>, compared to 6.19 for standard input-output and 6.93 for Chain-of-Thought approaches.</p><h3>When to Use ToT</h3><p>ToT is particularly effective for:</p><ul><li><p>Problems requiring non-trivial planning or search</p></li><li><p>Tasks with multiple valid solution paths</p></li><li><p>Scenarios where initial approaches might lead to dead ends</p></li><li><p>Complex reasoning that benefits from exploring alternatives</p></li><li><p>Creative tasks requiring evaluation of different possibilities</p></li></ul><p>The technique shines in applications like creative writing, puzzle-solving, and complex mathematical problems where the ability to explore multiple pathways and backtrack when needed leads to more robust solutions.</p><p>For implementation, ToT can be integrated with various search strategies and evaluation 
methods. The breadth parameter (<em>b</em>) can be adjusted based on the complexity of the task&#8212;simpler problems might work well with <em>b=1</em>, while more complex ones benefit from higher values like <em>b=5</em> to explore more possibilities simultaneously.</p><h2>Retrieval Augmented Generation (RAG)</h2><p>Retrieval Augmented Generation (RAG) enhances Large Language Models (LLMs) by combining document retrieval with answer generation, making it valuable when working with proprietary or dynamic data not part of the model's original training.</p><p>The process works in two distinct phases. First, the system retrieves relevant documents using dense embeddings&#8212;vector representations of both the query and potential source documents. Retrieval can be implemented using various database formats depending on your specific needs, including vector databases, summary indices, tree indices, or keyword table indices. The system identifies the most relevant documents by finding those whose vectors are closest to the query vector in terms of Euclidean distance.</p><p>In the second phase, the LLM generates a response by combining the user's query with the retrieved documents. This approach significantly improves the model's ability to provide accurate, factual responses while reducing hallucination, as it relies on retrieved facts rather than solely on its training data.</p><p>An advanced implementation of RAG is GraphRAG, developed by Microsoft Research, which extends the basic RAG architecture by incorporating knowledge graphs. This enhancement allows the system to connect disparate pieces of information, synthesize insights across multiple sources, understand summarized semantic concepts over large data collections, and combine access to both unstructured and structured data.</p><p>GraphRAG has shown particular effectiveness when working with complex datasets. 
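</p><p>The basic retrieve-then-generate flow described above can be sketched as follows. The letter-count <code>embed</code> is a toy stand-in for a real embedding model, <code>call_llm</code> is stubbed, and ranking uses the Euclidean distance mentioned earlier:</p>

```python
import math

# Toy embedding: letter counts. A real system would call an embedding model.
def embed(text: str) -> list[float]:
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def euclidean(a: list[float], b: list[float]) -> float:
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Phase 1: retrieve the documents whose vectors are closest to the query vector.
def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    q = embed(query)
    return sorted(docs, key=lambda d: euclidean(q, embed(d)))[:k]

# Stand-in for the generation model; stubbed so the sketch is self-contained.
def call_llm(prompt: str) -> str:
    return "(answer grounded in the retrieved context)"

# Phase 2: generate an answer from the query plus the retrieved documents.
def rag_answer(query: str, docs: list[str]) -> str:
    context = "\n".join(retrieve(query, docs))
    return call_llm(f"Context:\n{context}\n\nQuestion: {query}\nAnswer:")

docs = [
    "Clouds are made of water vapor.",
    "Rails is a web framework written in Ruby.",
    "RAG combines retrieval with generation.",
]
answer = rag_answer("What are clouds made of?", docs)
```

<p>Swapping the in-memory list for a vector database changes only the <code>retrieve</code> step; the generation phase is untouched.</p><p>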
For instance, it has been successfully applied to the Violent Incident Information from News Articles (VIINA) dataset, demonstrating significant improvements in generating comprehensive and diverse answers for complex analytical questions.</p><p>The key advantage of RAG is its ability to handle information that wasn't included in the model's initial training or fine-tuning phases. This makes it especially valuable for enterprise applications where you need to work with internal documentation, frequently updated information, proprietary data, and domain-specific knowledge.</p><p>When implementing RAG, you can leverage different types of storage systems based on your specific needs. The choice between vector databases, summary indices, or tree indices will depend on factors like your data structure, query patterns, and performance requirements.</p><h2>Automatic Reasoning and Tool-use</h2><p>Automatic reasoning and tool-use combines an LLM's reasoning capabilities with the ability to interact with external tools and APIs, enabling it to make decisions and take actions.</p><h3>When to Use Automatic Reasoning and Tool-use</h3><p>You should consider implementing automatic reasoning and tool-use when your application needs to:</p><ul><li><p>Solve complex problems that require multiple steps of logical reasoning</p></li><li><p>Access external data or functionality not contained within the LLM's knowledge</p></li><li><p>Generate action plans and execute them through external tools</p></li><li><p>Automate workflows that combine decision-making with practical actions</p></li></ul><h3>Integration with External Tools</h3><p>The implementation typically involves creating a framework where the LLM can:</p><ol><li><p>Analyze the task and break it down into logical steps</p></li><li><p>Identify which tools or APIs are needed for each step</p></li><li><p>Generate the appropriate calls to these tools</p></li><li><p>Process the results and incorporate them into its reasoning 
chain</p></li></ol><p>While specific implementations vary, the general pattern involves providing the LLM with:</p><ul><li><p>A description of available tools and their capabilities</p></li><li><p>The format for tool invocation</p></li><li><p>How to interpret and use tool responses</p></li></ul><h3>Applications and Benefits</h3><p>This technique is particularly valuable in scenarios such as:</p><ul><li><p>Automated research workflows where the LLM needs to query databases or search engines</p></li><li><p>Data analysis pipelines where the model must process information from multiple sources</p></li><li><p>Task automation systems that need to interact with various APIs</p></li><li><p>Decision support systems that combine reasoning with real-world data</p></li></ul><p>The key advantage is the ability to extend the LLM's capabilities beyond its training data by connecting it to external tools and data sources. This creates more powerful and practical applications that can take real actions based on reasoned decisions.</p><h3>Integration with Other Techniques</h3><p>Automatic reasoning and tool-use often works in conjunction with other prompt engineering techniques. For example, it can be combined with ReAct (Reasoning and Acting) to create systems that can both reason about problems and take appropriate actions to solve them. 
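</p><p>A minimal reason-then-act loop might look like the sketch below. The <code>TOOL:</code> protocol and the tool names are illustrative, not a real framework API, and the model call is stubbed so the dispatch logic runs end to end:</p>

```python
# Registry of external tools the model may invoke. The calculator uses a
# restricted eval purely for illustration.
TOOLS = {
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
}

# Stubbed model: first emits a tool call, then answers once it sees the
# observation fed back into its prompt.
def call_llm(prompt: str) -> str:
    if "Observation:" not in prompt:
        return "TOOL: calculator 2 + 2"
    return "Final answer: 4"

def run_agent(question: str) -> str:
    prompt = f"Question: {question}\nAvailable tools: {', '.join(TOOLS)}"
    reply = call_llm(prompt)
    if reply.startswith("TOOL:"):
        name, _, args = reply[len("TOOL:"):].strip().partition(" ")
        observation = TOOLS[name](args)  # execute the external tool
        reply = call_llm(f"{prompt}\nObservation: {observation}")
    return reply

result = run_agent("What is 2 + 2?")
```

<p>Real frameworks loop this reason/act/observe cycle until the model emits a final answer rather than another tool call.</p><p>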
This combination enhances the LLM's ability to handle complex tasks by breaking them down into manageable steps of reasoning and action.</p><p>The future development of this technique points toward more sophisticated frameworks that can automatically determine which tools to use and when, making AI systems more autonomous and capable of handling complex real-world tasks that require both reasoning and practical action.</p><h2>Active-Prompt</h2><p>Active-Prompt adapts and refines prompts in real-time based on the model's responses and performance, continuously optimizing output quality in production environments.</p><h3>Dynamic Adaptation Mechanism</h3><p>The core of Active-Prompt lies in its feedback loop system:</p><ol><li><p>Initial prompt execution</p></li><li><p>Response evaluation</p></li><li><p>Prompt refinement based on performance metrics</p></li><li><p>Iterative improvement through continuous monitoring</p></li></ol><p>This mechanism is particularly valuable when working with LLM applications that need to maintain high performance and reliability in production environments. The prompt adjustments can be based on various factors:</p><ul><li><p>Response quality metrics</p></li><li><p>Context adherence</p></li><li><p>Instruction following accuracy</p></li><li><p>Error rates and completion success</p></li><li><p>User interaction patterns</p></li></ul><h3>Implementation Approach</h3><p>To implement Active-Prompt effectively, you'll need to:</p><ol><li><p>Set up monitoring infrastructure to track prompt performance</p></li><li><p>Define clear evaluation metrics for response quality</p></li><li><p>Create adjustment rules for prompt modification</p></li><li><p>Implement feedback mechanisms for continuous improvement</p></li></ol><p>Here's a conceptual example of how to structure an Active-Prompt system:</p><pre><code><code>class ActivePromptSystem:
    def __init__(self, base_prompt, evaluation_metrics):
        self.current_prompt = base_prompt
        self.metrics = evaluation_metrics
        self.performance_history = []

    def evaluate_response(self, response):
        # Score the response and keep a history for trend analysis
        score = self.metrics.evaluate(response)
        self.performance_history.append(score)
        return score

    def adjust_prompt(self, score):
        # Refine only when quality drops below the acceptable threshold
        if score &lt; self.metrics.threshold:
            self.current_prompt = self.refine_prompt(score)

    def refine_prompt(self, score):
        # Placeholder: apply your refinement rules here, e.g. add
        # clarifying instructions or swap in stronger few-shot examples
        return self.current_prompt

    def execute_prompt(self, prompt, input_text):
        # Placeholder: call your LLM provider with the prompt and input
        raise NotImplementedError

    def execute_with_monitoring(self, input_text):
        response = self.execute_prompt(self.current_prompt, input_text)
        score = self.evaluate_response(response)
        self.adjust_prompt(score)
        return response
</code></code></pre><h3>Benefits for Production Systems</h3><p>Active-Prompt offers several advantages for production LLM applications:</p><ul><li><p>Real-time quality maintenance through continuous monitoring</p></li><li><p>Automatic adaptation to changing conditions or requirements</p></li><li><p>Reduced need for manual prompt engineering interventions</p></li><li><p>Improved reliability in production environments</p></li><li><p>Better handling of edge cases through dynamic adjustments</p></li></ul><p>The technique works particularly well when integrated with comprehensive monitoring systems that can track and analyze prompt performance metrics in real-time. This aligns with production requirements where maintaining consistent quality and reliability is crucial for mission-critical applications.</p><p>For optimal results, Active-Prompt should be combined with robust observability tools that can provide detailed insights into prompt performance and help identify areas needing adjustment. This enables both automated and manual refinements to the prompting strategy based on real-world usage patterns and performance data.</p><h2>Directional Stimulus Prompting</h2><p>Directional Stimulus Prompting (DSP) incorporates specific hints, cues, or keywords to guide the language model toward producing desired outputs in alignment with your goals.</p><h3>How It Works</h3><p>The core mechanism of DSP involves embedding strategic cues within your prompt that serve as "stimuli" for the model. These cues can take various forms:</p><ul><li><p>Keyword hints that should appear in the response</p></li><li><p>Structural elements that guide the format</p></li><li><p>Contextual signals that frame the desired perspective</p></li><li><p>Specific terminology that should be incorporated</p></li></ul><p>For example, instead of asking "Write about artificial intelligence," you might use DSP like this:</p><pre><code><code>Write about artificial intelligence, incorporating these key aspects:
- Neural networks
- Machine learning algorithms
- Real-world applications
Focus on enterprise implementations and emphasize scalability.
</code></code></pre><p>The directional elements in this prompt ("enterprise implementations" and "scalability") help steer the model toward a specific type of response while the keyword hints ensure coverage of essential concepts.</p><h3>When to Use DSP</h3><p>DSP is particularly valuable when you need:</p><ul><li><p>Precise control over the model's output format</p></li><li><p>Consistent inclusion of specific elements or terminology</p></li><li><p>Alignment with domain-specific requirements</p></li><li><p>Structured responses that follow particular patterns</p></li><li><p>Outputs that maintain focus on certain aspects while avoiding others</p></li></ul><p>This technique shines in professional contexts where output consistency and adherence to specific requirements are crucial. For instance, when generating technical documentation, product descriptions, or specialized reports where certain elements must be present in the final output.</p><h3>Benefits and Considerations</h3><p>The primary advantage of DSP is its ability to provide finer control over the model's outputs while maintaining natural language flow. Research indicates that this approach helps improve the reliability and specificity of responses while reducing the likelihood of off-topic or irrelevant content.</p><p>However, it's important to balance directional elements with enough flexibility for the model to leverage its capabilities. 
Over-constraining the prompt with too many directional stimuli can lead to rigid or artificial-sounding outputs.</p><p>When implementing DSP, focus on:</p><ul><li><p>Using clear, unambiguous directional cues</p></li><li><p>Maintaining a natural flow despite the embedded stimuli</p></li><li><p>Balancing guidance with creative freedom</p></li><li><p>Testing different combinations of directional elements to find optimal results</p></li></ul><p>By thoughtfully incorporating directional stimuli into your prompts, you can achieve more predictable and targeted outputs while maintaining the natural language capabilities of the model.</p><h2>ReAct: Combining Reasoning and Action</h2><p>ReAct (Reasoning and Acting) combines verbal reasoning with action generation in language models, enabling them to think through complex problems and take specific actions.</p><h3>How ReAct Works</h3><p>The ReAct framework operates by prompting language models to generate two key components:</p><ul><li><p>Verbal reasoning traces that show the model's thought process</p></li><li><p>Specific actions based on that reasoning</p></li></ul><p>This dual approach enables dynamic reasoning and high-level planning while allowing interaction with external environments. 
For example, when solving a complex question, ReAct will first reason about the necessary steps, then take actions to gather information, and finally synthesize the answer.</p><h3>Performance and Benefits</h3><p>ReAct has demonstrated significant improvements across various benchmarks:</p><ul><li><p><a href="https://www.mercity.ai/blog-post/advanced-prompt-engineering-techniques">34% performance improvement on ALFWorld</a> for text-based game navigation</p></li><li><p>10% improvement on WebShop for web page navigation tasks</p></li><li><p>Superior performance on question answering (HotPotQA) and fact verification (Fever) tasks</p></li></ul><p>The framework offers several key advantages:</p><ul><li><p>Reduces hallucination by grounding reasoning in specific actions</p></li><li><p>Prevents error propagation in chain-of-thought reasoning</p></li><li><p>Provides interpretable decision-making processes</p></li><li><p>Allows human inspection and correction during task execution</p></li><li><p>Maintains robustness across diverse tasks</p></li></ul><h3>Applications</h3><p>You should consider using ReAct when your application requires:</p><ul><li><p>Complex question answering that needs multiple steps</p></li><li><p>Fact verification tasks requiring evidence gathering</p></li><li><p>Navigation of text-based environments or web interfaces</p></li><li><p>Tasks requiring both reasoning and interaction with external tools</p></li><li><p>Scenarios where you need to inspect or control the model's behavior</p></li></ul><p>ReAct's design makes it particularly effective for tasks that combine reasoning with real-world interactions, making it a valuable tool for building more capable and reliable AI systems. 
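The Thought → Action → Observation cycle can be sketched as a small driver loop. The "Action: Name[arg]" format, the lookup table standing in for a retrieval tool, and the scripted fake_llm are illustrative assumptions, not the paper's exact interface:

```python
# ReAct driver: the model alternates free-form Thoughts with Actions;
# each Action is executed and its Observation is appended to the trace.
import re

# Hypothetical stand-in for a search/retrieval tool.
FACTS = {"Paris": "Paris is the capital of France."}

def react(llm, question, max_turns=5):
    trace = f"Question: {question}\n"
    for _ in range(max_turns):
        step = llm(trace)                  # "Thought: ...\nAction: ..."
        trace += step + "\n"
        action = re.search(r"Action: (\w+)\[(.*)\]", step)
        if action is None:
            break
        name, arg = action.groups()
        if name == "Finish":               # model declares its answer
            return arg
        obs = FACTS.get(arg, "No result.") # execute the tool
        trace += f"Observation: {obs}\n"
    return trace

def fake_llm(trace):
    # Scripted stand-in model: look up once, then finish.
    if "Observation:" not in trace:
        return "Thought: I should look this up.\nAction: Lookup[Paris]"
    return ("Thought: I have the answer.\n"
            "Action: Finish[Paris is the capital of France.]")

print(react(fake_llm, "What is the capital of France?"))
```

Because every Thought and Observation lands in the trace, a human can inspect (or interrupt) the episode at any turn, which is where the interpretability benefit comes from.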
The framework's ability to generate both reasoning traces and actions makes it especially useful in production environments where transparency and control are crucial.</p><h2>Reflexion</h2><p>Reflexion implements a feedback loop mechanism, allowing language models to learn from their own outputs and improve through iteration by maintaining an episodic memory buffer of self-reflections.</p><h3>How Reflexion Works</h3><p>The Reflexion framework operates through a three-step cycle:</p><ol><li><p>The model attempts to solve the given task</p></li><li><p>It generates verbal self-reflections about its performance</p></li><li><p>These reflections are stored in memory and used to inform future attempts</p></li></ol><p>What makes Reflexion particularly versatile is its ability to incorporate various types of feedback signals. These can range from scalar values (like rewards or punishments) to free-form language feedback, and can come from either external sources (humans or other agents) or be internally generated by the model itself.</p><h3>Performance Benefits</h3><p>The effectiveness of Reflexion has been demonstrated across multiple domains:</p><ul><li><p>In decision-making tasks (AlfWorld), implementations showed a <a href="https://www.mercity.ai/blog-post/advanced-prompt-engineering-techniques">22% improvement</a> over 12 iterative learning steps</p></li><li><p>For reasoning questions (HotPotQA), <a href="https://www.mercity.ai/blog-post/advanced-prompt-engineering-techniques">accuracy increased by 20%</a></p></li><li><p>Most impressively, in Python programming tasks (HumanEval), Reflexion achieved a <a href="https://www.mercity.ai/blog-post/advanced-prompt-engineering-techniques">91% pass@1 accuracy</a>, significantly outperforming GPT-4's 80% baseline</p></li></ul><h3>Implementation Example</h3><p>Here's how you might structure a basic Reflexion prompt:</p><pre><code><code>Task: [Your specific task]
Previous Attempt: [Model's last solution]
Reflection: Let's analyze the previous attempt:
1. What worked well?
2. What could be improved?
3. What should we do differently?

New Solution:
[Model generates improved solution based on reflection]
</code></code></pre><p>The key to successful Reflexion implementation is maintaining a clear record of previous attempts and their corresponding reflections. This allows the model to build upon its experiences and avoid repeating past mistakes, creating a continuous improvement cycle that leads to increasingly refined outputs.</p><p>The technique is particularly effective for complex tasks requiring multiple iterations or when initial solutions might be suboptimal. By incorporating feedback and self-reflection, Reflexion provides a structured way for language models to refine their responses and achieve better outcomes through systematic improvement.</p><h2>Multimodal Chain-of-Thought (CoT)</h2><p>Multimodal Chain-of-Thought (CoT) extends traditional CoT prompting to handle tasks that combine text with other media types, enabling AI systems to reason about relationships between different modalities.</p><h3>Understanding Image-Text Coherence</h3><p>When implementing multimodal CoT, you need to consider different types of coherence relations between images and text. According to <a href="https://www.frontiersin.org/articles/10.3389/frai.2023.1048874">recent research</a>, these relations can be categorized into:</p><ul><li><p>Visible relations: Direct descriptions of image content</p></li><li><p>Action relations: Descriptions of events or actions shown</p></li><li><p>Subjective relations: Evaluations or reactions to content</p></li><li><p>Story relations: Background context or narrative</p></li><li><p>Meta relations: Technical or contextual information about the image</p></li></ul><p>Understanding these relations is crucial because they form the foundation of how your prompts should guide the model's reasoning process across modalities.</p><h3>Benefits and Performance Impact</h3><p>Multimodal CoT builds on the impressive performance gains seen in traditional CoT implementations. 
For example, research shows that in large models like PaLM 540B, CoT prompting improves performance by:</p><ul><li><p>24% on mathematical reasoning tasks (SVAMP)</p></li><li><p>35% on symbolic reasoning tasks</p></li><li><p>19% on complex word problems (GSM8K)</p></li></ul><p>When applied to multimodal tasks, these benefits extend to:</p><ul><li><p>More accurate image-text relationship understanding</p></li><li><p>Better contextual reasoning about visual elements</p></li><li><p>Improved ability to generate coherent explanations about visual content</p></li></ul><h3>Implementation Considerations</h3><p>To effectively implement multimodal CoT, you should:</p><ol><li><p>Use models of sufficient size (&gt;100B parameters) as smaller models may produce illogical chains</p></li><li><p>Structure your prompts to explicitly address different coherence relations</p></li><li><p>Include reasoning steps that bridge visual and textual elements</p></li><li><p>Encourage the model to articulate its observations and conclusions about visual content</p></li></ol><p>Remember that multimodal CoT requires careful prompt design to maintain coherence across modalities. 
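In practice, one way to follow these guidelines is to template the reasoning steps so the model addresses each coherence relation explicitly before answering. A sketch, where the step wording and the pass-image-by-reference convention are assumptions:

```python
# Build a multimodal CoT prompt that walks the model through selected
# coherence relations before it answers; the image is passed by
# reference (URL or attachment id, depending on your API).
RELATION_STEPS = {
    "visible": "Describe what is directly visible in the image.",
    "action": "Describe any events or actions taking place.",
    "subjective": "Note how the text evaluates or reacts to the image.",
    "story": "State what background or narrative the text adds.",
}

def multimodal_cot_prompt(image_ref, caption, question,
                          relations=("visible", "action", "story")):
    # One numbered reasoning step per coherence relation, then the answer.
    steps = "\n".join(
        f"{i}. {RELATION_STEPS[r]}" for i, r in enumerate(relations, 1)
    )
    return (
        f"Image: {image_ref}\n"
        f"Caption: {caption}\n\n"
        f"Reason step by step:\n{steps}\n"
        f"{len(relations) + 1}. Using the steps above, answer: {question}"
    )

print(multimodal_cot_prompt("photo_001.jpg",
                            "A crowded farmers market at noon.",
                            "What season is it likely to be?"))
```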
Your prompts should guide the model to explain its reasoning about visual elements while maintaining logical connections to any textual context or requirements.</p><h2>Graph Prompting</h2><p>Graph prompting leverages knowledge graphs to enhance the capabilities of large language models, making it valuable for handling complex, interconnected information or structured reasoning.</p><h3>When to Use Graph Prompting</h3><p>You should consider graph prompting when your tasks involve:</p><ul><li><p>Complex relational reasoning between different pieces of information</p></li><li><p>Need for comprehensive understanding across large datasets</p></li><li><p>Requirements for connecting disparate information sources</p></li><li><p>Structured knowledge representation and querying</p></li></ul><p>Graph prompting excels in scenarios where traditional prompting methods might miss important connections or fail to capture the full context of related information. By incorporating knowledge graphs into your prompting strategy, you can enable the model to navigate through interconnected concepts and relationships more effectively.</p><h3>Implementation and Benefits</h3><p>The implementation of graph prompting typically combines LLM-generated knowledge graphs with graph machine learning techniques. 
This approach offers several key advantages:</p><ul><li><p>Enhanced context understanding: The model can better understand relationships between different pieces of information by visualizing them as connected nodes in a graph.</p></li><li><p>Improved answer generation: By leveraging the graph structure, responses can be more comprehensive and diverse, especially for complex queries.</p></li><li><p>Better information synthesis: The ability to connect multiple data points helps in generating more insightful and holistic answers.</p></li></ul><p>Microsoft Research has demonstrated the effectiveness of this approach through <a href="https://www.microsoft.com/en-us/research/project/graphrag/">GraphRAG</a>, which extends Retrieval Augmented Generation (RAG) with knowledge graph capabilities. This implementation has shown particular success with complex datasets like the Violent Incident Information from News Articles (VIINA), where understanding relationships between different events and entities is crucial.</p><h3>Advanced Applications</h3><p>Graph prompting can be particularly powerful when:</p><ul><li><p>Working with both structured and unstructured data simultaneously</p></li><li><p>Requiring improved ranking and relevance in information retrieval</p></li><li><p>Needing to maintain consistency across related pieces of information</p></li><li><p>Handling complex query chains that involve multiple steps of reasoning</p></li></ul><p>One of the most significant advantages is the ability to perform text-to-query generation while maintaining the context of the broader knowledge structure. 
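As a toy illustration of the idea: store facts as triples, gather everything connected to the query's entities, and inline those facts as prompt context. The triple store and fixed-depth traversal are deliberate simplifications of what GraphRAG-style systems do:

```python
# Toy graph prompting: facts live as (subject, relation, object) edges,
# and the prompt is built from the neighborhood of the query entity.
from collections import defaultdict

class FactGraph:
    def __init__(self):
        self.edges = defaultdict(list)

    def add(self, subj, rel, obj):
        self.edges[subj].append((rel, obj))

    def neighborhood(self, entity, depth=2):
        # Breadth-first collection of facts reachable from `entity`.
        facts, frontier = [], [entity]
        for _ in range(depth):
            next_frontier = []
            for node in frontier:
                for rel, obj in self.edges[node]:
                    facts.append(f"{node} {rel} {obj}.")
                    next_frontier.append(obj)
            frontier = next_frontier
        return facts

    def prompt(self, entity, question):
        context = "\n".join(self.neighborhood(entity))
        return f"Known facts:\n{context}\n\nQuestion: {question}"

g = FactGraph()
g.add("AcmeDB", "is developed by", "AcmeCorp")
g.add("AcmeCorp", "is headquartered in", "Berlin")
print(g.prompt("AcmeDB", "Where is AcmeDB's vendor based?"))
```

The multi-hop fact ("AcmeCorp is headquartered in Berlin") only reaches the model because the graph traversal connected it to the query entity, which is exactly the kind of relationship a flat retrieval step tends to miss.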
This enables more accurate and contextually relevant responses, especially in domains where relationships between different pieces of information are crucial for understanding.</p><p>By incorporating graph-based structures into your prompting strategy, you can create more sophisticated and capable AI applications that better handle complex, interconnected information while maintaining contextual accuracy and relevance.</p>]]></content:encoded></item><item><title><![CDATA[Joining GrowthX]]></title><description><![CDATA[We&#8217;re hiring founding engineers]]></description><link>https://journal.daniellopes.dev/p/joining-growthx</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/joining-growthx</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Thu, 19 Dec 2024 16:46:57 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I'm thrilled to share some exciting news: I've joined GrowthX.ai.</p><p>What we&#8217;ve achieved in just ~7 months since inception is remarkable. $3M+ in ARR. Profitable. Grew team to 15. Zero funding!</p><p>Here&#8217;s the backstory:</p><p>At the beginning of the year, I took a sabbatical to immerse myself in the world of large language models (LLMs). I dove deep into retrieval-augmented generation (RAG), prompting strategies, fine-tuning techniques, and more. </p><p>Fast forward to mid-September, I got introduced to <a href="https://www.linkedin.com/in/marcelsantilli/">Marcel Santilli</a>. 
Within minutes, it was clear&#8212;he was onto something big and had the experience + ambition to deliver on it.<br><br>Marcel's pitch:</p><ul><li><p>Brands need to become publishers.</p></li><li><p>Founders need to become influencers.</p></li><li><p>You need strategy + experts to deliver it, but AI can make it scalable.</p></li></ul><p>Having helped with marketing before, I immediately got it&#8212;and Marcel had already proven the model&#8217;s effectiveness. The technical and product needs for the business were exactly the kinds of things I had spent the last year+ studying.</p><p>In the past four months, we&#8217;ve spent long hours validating a roadmap, getting the right team structure, and defining the GTM. Now we have the ambitious goal of reaching $15 million in ARR by 2025. To achieve this, my top priorities are building the right infra and putting together a great technical team.</p><p>We just opened two roles for founding engineers: <a href="https://growthx.ai/careers">https://growthx.ai/careers</a>. We are a remote-first company hiring from anywhere, as long as we can have a 4-hour timezone overlap with San Francisco. 
If you&#8217;re interested, don&#8217;t hesitate to reach out!</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!FupV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!FupV!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png 424w, https://substackcdn.com/image/fetch/$s_!FupV!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png 848w, https://substackcdn.com/image/fetch/$s_!FupV!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png 1272w, https://substackcdn.com/image/fetch/$s_!FupV!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!FupV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png" width="1203" height="518" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:518,&quot;width&quot;:1203,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:25944,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!FupV!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png 424w, https://substackcdn.com/image/fetch/$s_!FupV!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png 848w, https://substackcdn.com/image/fetch/$s_!FupV!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png 1272w, https://substackcdn.com/image/fetch/$s_!FupV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa71ca141-3e0d-47cd-a2fc-78d89c581555_1203x518.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p>]]></content:encoded></item><item><title><![CDATA[One Easy Step to Better AI-Generated Text]]></title><description><![CDATA[&#8216;leverage,&#8217; &#8216;harness,&#8217; &#8216;elevate,&#8217; &#8216;ignite,&#8217; &#8216;empower,&#8217; &#8216;cutting-edge,&#8217; &#8216;unleash,&#8217; &#8216;revolutionize,&#8217; &#8216;innovate,&#8217; &#8216;dynamic,&#8217; &#8216;transformative power&#8217; &#128529;]]></description><link>https://journal.daniellopes.dev/p/one-easy-step-to-better-ai-generated</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/one-easy-step-to-better-ai-generated</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Mon, 14 Oct 2024 21:38:46 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!9Wh8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png" length="0" 
type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9Wh8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9Wh8!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png 424w, https://substackcdn.com/image/fetch/$s_!9Wh8!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png 848w, https://substackcdn.com/image/fetch/$s_!9Wh8!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png 1272w, https://substackcdn.com/image/fetch/$s_!9Wh8!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9Wh8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png" width="1186" height="422" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:422,&quot;width&quot;:1186,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:113957,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!9Wh8!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png 424w, https://substackcdn.com/image/fetch/$s_!9Wh8!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png 848w, https://substackcdn.com/image/fetch/$s_!9Wh8!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png 1272w, https://substackcdn.com/image/fetch/$s_!9Wh8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe5f9ab1e-7fa6-46d2-b171-d704faea0a6f_1186x422.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>We all know that LLMs struggle with writing unless you put in effort to improve their output. To save our eyeballs when reading AI-generated content, add this to every writing request you make to an LLM:</p><pre><code>## Style guideline:

Avoid overused buzzwords (like &#8216;leverage,&#8217; &#8216;harness,&#8217; &#8216;elevate,&#8217; &#8216;ignite,&#8217; &#8216;empower,&#8217; &#8216;cutting-edge,&#8217; &#8216;unleash,&#8217; &#8216;revolutionize,&#8217; &#8216;innovate,&#8217; &#8216;dynamic,&#8217; &#8216;transformative power&#8217;), filler phrases (such as &#8216;in conclusion,&#8217; &#8216;it&#8217;s important to note,&#8217; &#8216;as previously mentioned,&#8217; &#8216;ultimately,&#8217; &#8216;to summarize,&#8217; &#8216;what&#8217;s more,&#8217; &#8216;now,&#8217; &#8216;until recently&#8217;), clich&#233;s (like &#8216;game changer,&#8217; &#8216;push the boundaries,&#8217; &#8216;the possibilities are endless,&#8217; &#8216;only time will tell,&#8217; &#8216;mind-boggling figure,&#8217; &#8216;breaking barriers,&#8217; &#8216;unlock the potential,&#8217; &#8216;remarkable breakthrough&#8217;), and flowery language (including &#8216;tapestry,&#8217; &#8216;whispering,&#8217; &#8216;labyrinth,&#8217; &#8216;oasis,&#8217; &#8216;metamorphosis,&#8217; &#8216;enigma,&#8217; &#8216;gossamer,&#8217; &#8216;treasure trove,&#8217; &#8216;labyrinthine&#8217;). Also, limit the use of redundant connectives and fillers like &#8216;moreover,&#8217; &#8216;furthermore,&#8217; &#8216;additionally,&#8217; &#8216;however,&#8217; &#8216;therefore,&#8217; &#8216;consequently,&#8217; &#8216;importantly,&#8217; &#8216;notably,&#8217; &#8216;as well as,&#8217; &#8216;despite,&#8217; &#8216;essentially,&#8217; and avoid starting sentences with phrases like &#8216;Firstly,&#8217; &#8216;Moreover,&#8217; &#8216;In today&#8217;s digital era,&#8217; &#8216;In the world of&#8217;. 
Focus on delivering the information in a concise and natural tone without unnecessary embellishments, jargon, or redundant phrases.</code></pre><p></p>]]></content:encoded></item><item><title><![CDATA[7-Powers book notes]]></title><description><![CDATA[While mentoring the current Techstars cohort that I&#8217;m working with, I find myself mentioning Hamilton Helmer's book over and over.]]></description><link>https://journal.daniellopes.dev/p/7-powers-book-summary</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/7-powers-book-summary</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Thu, 19 Sep 2024 02:32:18 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!e3KM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>While mentoring the current Techstars cohort that I&#8217;m working with, I find myself mentioning <a href="https://www.amazon.com/7-Powers-Foundations-Business-Strategy/dp/0998116319">Hamilton Helmer's book</a> over and over. So here&#8217;s a quick summary:</p><h1>Summary</h1><p>The main idea is that every single company that doesn't rely on cheating (corruption, government ties, etc.) to come to life and achieve long-term success (measured in profit/revenue) has to have at least one of these "powers":</p><ol><li><p>Scale Economies</p></li><li><p>Network effect</p></li><li><p>Counter positioning</p></li><li><p>Branding</p></li><li><p>Switching costs</p></li><li><p>Cornered resources</p></li><li><p>Process</p></li></ol><h2>Power progression</h2><p>The idea is that these powers aren't constant either, and especially in tech, things are much more dynamic than static. Just because you succeed with one of these powers doesn't mean it will last forever, and each power has different defensibility in the face of competition as well. 
He qualifies the impact of each power and the moment you can rely on it, tying each to one of the phases of the business:</p><ol><li><p>Origination</p></li><li><p>Take-off phase</p></li><li><p>Stability phase</p></li></ol><p>Each phase is self-explanatory, but his idea is that chasing specific barriers in the right phase of the business will build a much higher competitive advantage and stronger barriers for new players. Also, he is explicitly focused on business phases, not the product life-cycle.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!e3KM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!e3KM!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg 424w, https://substackcdn.com/image/fetch/$s_!e3KM!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg 848w, https://substackcdn.com/image/fetch/$s_!e3KM!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!e3KM!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!e3KM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg" width="684" height="529" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:529,&quot;width&quot;:684,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!e3KM!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg 424w, https://substackcdn.com/image/fetch/$s_!e3KM!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg 848w, https://substackcdn.com/image/fetch/$s_!e3KM!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!e3KM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccabf69a-e56a-4912-9577-cd40cb789a6b_684x529.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft 
icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3>Powers applicable to Origination Phase</h3><p>The phase spanning the creation of the business and building everything, through launch, before achieving growth or market fit.</p><ul><li><p>Counter positioning</p></li><li><p>Cornered resources</p></li></ul><h3>Applicable to Take-off Phase</h3><p>Take-off phase is when the business starts to achieve fast growth.</p><ul><li><p>Scale economies</p></li><li><p>Switching Costs</p></li><li><p>Network effect</p></li></ul><h3>Applicable to Stability phase</h3><p>The break between take-off and stability is when unit growth falls below about 30%-40% per year.</p><ul><li><p>Branding</p></li><li><p>Process power</p></li></ul><h2>Now to the description and examples of each power (in order of business phases):</h2><h3>Counter 
positioning (origination)</h3><p>Can be identified by:</p><ul><li><p>An upstart who developed a superior business model.</p></li><li><p>That business model's ability to successfully challenge well-entrenched incumbents.</p></li><li><p>Steady accumulation of customers, all while the incumbent remains paralyzed and unable to respond.</p></li></ul><p><strong>Ex</strong>: The new business model is superior to the incumbent's model due to lower costs and/or the ability to charge higher prices.</p><h4>Barrier to entry:</h4><p>Incumbent(s) will eventually ask: "Am I better off staying the course, or adopting the new model?" If the answer is "stay the course", then the barrier is high. A newcomer adopts a new, superior business model which the incumbent does not mimic due to anticipated damage to their existing business.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ryf8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27ef931d-d247-4d9f-8155-27abaec765d3_684x463.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ryf8!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27ef931d-d247-4d9f-8155-27abaec765d3_684x463.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ryf8!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27ef931d-d247-4d9f-8155-27abaec765d3_684x463.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ryf8!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27ef931d-d247-4d9f-8155-27abaec765d3_684x463.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!ryf8!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27ef931d-d247-4d9f-8155-27abaec765d3_684x463.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ryf8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27ef931d-d247-4d9f-8155-27abaec765d3_684x463.jpeg" width="684" height="463" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/27ef931d-d247-4d9f-8155-27abaec765d3_684x463.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:463,&quot;width&quot;:684,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ryf8!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27ef931d-d247-4d9f-8155-27abaec765d3_684x463.jpeg 424w, https://substackcdn.com/image/fetch/$s_!ryf8!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27ef931d-d247-4d9f-8155-27abaec765d3_684x463.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ryf8!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27ef931d-d247-4d9f-8155-27abaec765d3_684x463.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!ryf8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F27ef931d-d247-4d9f-8155-27abaec765d3_684x463.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h4>Counter-Positioning versus Disruptive Technologies (Clayton Christensen)</h4><ul><li><p>Digital photography vs Kodak - this is a DT, but not CP.</p></li><li><p>In-N-Out vs. McDonald's - this is CP, but not DT</p></li><li><p>Netflix streaming vs. 
HBO via cable - this is both CP and DT</p></li></ul><p>The case for playing humble: Cognitive Bias can play a role in deterring the incumbent. But the challenger, by its posture, may be able to influence such a move. How to attempt this? In its ascendancy, the challenger should avoid the temptation of trumpeting its superiority, instead suppressing that urge and adopting a tone of respect toward the incumbent.</p><p>Counter positioning is not exclusive. Unlike other powers such as Network Effect, strong counter positioning doesn't require (or create) a winner-takes-all situation.</p><h4>Five stages of the incumbent when dealing with successful counter positioning from a new challenger:</h4><ol><li><p>Denial (aka, "we are not competitors")</p></li><li><p>Ridicule (aka, "we are better")</p></li><li><p>Fear</p></li><li><p>Anger</p></li><li><p>Capitulation (frequently too late)</p></li></ol><p>Blockbuster execs showed all 5 stages in their public statements when facing Netflix. I've personally dealt with founders at stages 1 and 2 many times (and they always ended up at stage 5 later).</p><p>Once market erosion becomes severe, a Counter-Positioned incumbent comes under tremendous pressure to do something; at the same time, they face great pressure to not upset the apple cart of the legacy business model. A frequent outcome of this duality? 
Let's call it dabbling: the incumbent puts a toe in the water, somehow, but refuses to commit in a way that meaningfully answers the challenge.</p><p>CP often underlies situations in which the following happens at the same time:</p><p>For the challenger:</p><ul><li><p>Rapid share gains</p></li><li><p>Strong profitability (or promise of it)</p></li></ul><p>For the incumbent:</p><ul><li><p>Share loss</p></li><li><p>Inability to counter the entrant's moves</p></li><li><p>Eventually management shake-up(s)</p></li><li><p>Capitulation</p></li></ul><div><hr></div><h3>Cornered resources (origination)</h3><p><strong>Benefit</strong>:</p><p>Cornered Resource can emerge in varied forms, offering uniquely different benefits. It might, for example, be preferential access to a valuable patent, such as that for a blockbuster drug; a required input, such as a cement producer's ownership of a nearby limestone source, or a cost-saving production manufacturing approach.</p><p><strong>Ex</strong>: In the Pixar case, the Brain Trust produced an uncommonly appealing product&#8212;"superior deliverables"&#8212;driving demand with very attractive price/volume combinations in the form of huge box office returns.</p><p><strong>Barrier</strong>:</p><p>Barrier in Cornered Resource is unlike anything we have encountered before. You might wonder: "Why does Pixar retain the Brain Trust?" Any one of this group would be highly sought after by other animated film companies, and yet over this period, and no doubt into the future, they have stayed with Pixar. In the case of spin casting technology, it is patent law, and in the case of cement inputs, it is property rights.</p><h4>The Five Tests to validate Cornered Resource:</h4><ol><li><p>Individual/Exclusive: If a firm repeatedly acquires coveted assets at attractive terms, then the proper strategy question is, "Why are they able to do this?" 
Ex: if Exxon was able to persistently gain the rights to desirable properties, then understanding their path to that is key. Does relative scale allow them to develop better discovery processes? If so, their discovery processes are the Cornered Resource.</p></li><li><p>Non-arbitraged: cost to keep the resource should be low enough to afford consistent differential returns (ex: a movie hiring Brad Pitt might move the box office but won't necessarily 10x the cost).</p></li><li><p>Transferable: If a resource creates value at a single company but would fail to do so at other companies, then isolating that resource as the source of Power would entail overlooking some other essential complement beyond operational excellence.</p></li><li><p>Ongoing: In searching for Power, a strategist tries to isolate a causal factor that explains continued differential returns. Ex. the Post-it notes patent lasted for 30 years.</p></li><li><p>Sufficient: The final Cornered Resource test concerns completeness: for a resource to qualify as Power, it must be sufficient for continued differential returns, assuming operational excellence. Ex. George Fisher joining Kodak wasn't enough.</p></li></ol><div><hr></div><h3>Switching costs (take-off phase)</h3><p><strong>Ex</strong>: SAP - 43% of users are unhappy but 85% say they will continue to pay for the product, and the sales validate their statements.</p><p><strong>Benefit</strong>:</p><p>A company that has embedded Switching Costs for its current customers can charge higher prices than competitors for equivalent products or services. The Benefit is only possible when you have additional offerings to sell.</p><p><strong>Barrier</strong>:</p><p>To offer an equivalent product, competitors must compensate customers for Switching Costs. 
The firm that has previously roped in the customer, then, can set or adjust prices in a way that puts their potential rival at a cost disadvantage, rendering such a challenge distinctly unattractive.</p><h4>Types of switching costs:</h4><ul><li><p>Financial: $ cost of investing and purchasing new solutions (ex: SAP)</p></li><li><p>Procedural: loss of familiarity or risk of adoption of a new product (ex: "nobody was ever fired for choosing IBM")</p></li><li><p>Relational: Affection with the provider (ex: sales team) or product/identity as a user.</p></li></ul><div><hr></div><h3>Network Effect (take-off phase)</h3><p><strong>Benefit</strong>:</p><p>Network Economies occur when the value of a product to a customer is increased by the use of the product by others (Branch Out vs LinkedIn).</p><p>A company in a leadership position with Network Economies can charge higher prices than its competitors, because of the higher value as a result of more users.</p><p><strong>Barrier</strong>:</p><p>For Network Economies, the barrier is the unattractive cost/benefit of gaining share, and this can be extremely high. In particular, the value deficit of a follower can be so large that the price discount needed to offset this is unthinkable.</p><p><strong>Example</strong>: The value of LinkedIn's HR Solutions Suite comes from the number of LinkedIn users, so LinkedIn can charge more than a competing product with fewer participants.</p><h4>Characteristics:</h4><p>Industries exhibiting Network Economies often exhibit these attributes:</p><ol><li><p>Winner take all: Businesses with strong Network Economies are frequently characterized by a tipping point: once a single firm achieves a certain degree of leadership, then the other firms just throw in the towel. Game over&#8212;the P&amp;L of a challenge would just be too ugly. 
For example, even a company as competent and with as deep pockets as Google could not unseat Facebook with Google+.</p></li><li><p>Boundedness: As powerful as this Barrier is, it is bounded by the character of the network, something well-demonstrated by the continued success of both Facebook and LinkedIn. Facebook has powerful Network Economies itself but these have to do with personal not professional interactions. The boundaries of the network effects determine the boundaries of the business.</p></li><li><p>Decisive early product: Due to tipping point dynamics, early relative scaling is critical in developing Power. Who scales the fastest is often determined by who gets the product most right early on. Facebook's trumping of MySpace is a good example.</p></li></ol><div><hr></div><h3>Scale Economies (take-off)</h3><p>The quality of declining unit costs with increased business size is referred to as Scale Economies.</p><p><strong>Benefit</strong>:</p><p>Some condition which yields material improvement in the cash flow of the Power wielder via reduced cost, enhanced pricing and/or decreased investment requirements.</p><p><strong>Barrier</strong>:</p><p>Some obstacle which engenders in competitors an inability and/or unwillingness to engage in behaviors that might, over time, arbitrage out this benefit.</p><p>For Scale Economies, the Benefit is straightforward: lowered costs. In the case of Netflix, their lead in subscribers translated directly into lower content costs per subscriber for originals and exclusives.</p><p><strong>Example</strong>:</p><p>Netflix paid $100M for House of Cards and their streaming business had 30M customers, then the cost per customer was three dollars and change. In this scenario, a competitor with only one million subscribers would have to ante up $100 per subscriber.</p><p>This situation creates a very difficult position for Netflix's smaller-scale streaming competitors. 
If they offer the same deliverable as Netflix, similar amounts of content for the same price, their P&amp;L will suffer. If they try to remediate this by offering less content or raising prices, customers will abandon their service and they will lose market share. Such a competitive cul-de-sac is the hallmark of Power.</p><div><hr></div><h3>Branding (stability phase)</h3><p>Branding as power to charge higher prices and sustain market share.</p><p><strong>Ex</strong>: Purchased a Diamond ring at Tiffany for $16,600 and one of similar size and cut at Costco for $6,600, then asked a reputable gemologist and appraiser to assess the rings' values: Costco ring at $8,000 plus setting costs, more than $2,000 above the selling price. Assessed Tiffany ring at $10,500 plus setting costs at a non-brand-name retailer.</p><p>"You got exactly what they said you were getting. Anything that is brand-name and has developed a reputation that Tiffany has developed, they've earned it over the years for quality control. You can go there [and] you don't have to think twice about your purchase. And you pay for that."</p><p><strong>Benefit</strong>:</p><p>A business with Branding power is able to charge a higher price for its offering due to one or both of these two reasons:</p><ol><li><p>Affective valence - The built-up associations with the brand elicit good feelings about the offering, distinct from the objective value of the good. (Ex Tiffany's)</p></li><li><p>Uncertainty reduction - A customer attains "peace of mind" knowing that the branded product will be just as expected. Consider another example: Bayer aspirin. Search for aspirin on Amazon.com and you will see a 200 count of Bayer 325 mg. aspirin for $9.47 side-by-side with a 500 count of Kirkland 325 mg. aspirin for $10.93. So Bayer has a price per tablet premium of 117%. 
Some customers still would prefer the Bayer because of diminished uncertainty: Bayer's long history of consistency makes customers more confident that they are getting exactly what they want.</p></li></ol><p>Note that the Benefit from Branding does not depend on prior ownership, as with Switching Costs.</p><p><strong>Barrier</strong>:</p><p>A strong brand can only be created over a lengthy period of reinforcing actions (hysteresis), which itself serves as the key Barrier.</p><div><hr></div><h3>Process (stability phase)</h3><p><strong>Ex</strong>. Toyota Process (TPS)</p><p><strong>Benefit</strong>:</p><p>A company with Process Power is able to improve product attributes and/or lower costs as a result of process improvements embedded within the organization. For example, Toyota has maintained the quality increases and cost reductions of the TPS over a span of decades; these assets do not disappear as new workers are brought in and older workers retire.</p><p><strong>Barrier</strong>:</p><p>These process advances are difficult to replicate, and can only be achieved over a long time period of sustained evolutionary advance. This inherent speed limit in achieving the Benefit results from two factors:</p><ol><li><p>Complexity - Returning to our example: automobile production, combined with all the logistic chains which support it, entails enormous complexity. If process improvements touch many parts of these chains, as they did with Toyota, then achieving them quickly will prove challenging, if not impossible.</p></li><li><p>Opacity - The development of TPS should tip us off to the long time constant inevitably faced by would-be imitators. The system was fashioned from the bottom up, over decades of trial and error. The fundamental tenets were never formally codified, and much of the organizational knowledge remained tacit, rather than explicit. 
It would not be an exaggeration to say that even Toyota did not have a full, top-down understanding of what they had created &#8212;it took fully fifteen years, for instance, before they were able to transfer TPS to their suppliers. GM's experience with NUMMI also implies the tacit character of this knowledge: even when Toyota wanted to illuminate their work processes, they could not entirely do so.</p></li></ol><div><hr></div><h3>Innovation is the key to achieving any of the powers</h3><p>Innovation is achieved by creating compelling value. There are three distinct paths to creating compelling value.</p><ol><li><p><strong>Capabilities-Led Compelling Value</strong>: Adobe Acrobat. Here the key capability brought to bear was Adobe's existing fluency at the intersection of software and graphics.</p></li><li><p><strong>Customer-Led Compelling Value</strong>: Corning Fiber Optics. Interaction at a distance. The uncertainty in this case is technical: "Can we invent it?"</p></li><li><p><strong>Competitor-Led Compelling Value</strong>: the Sony PlayStation. 
Sony perceived the value in immersion of 3D graphics and the gap in the competitors' offerings (Sega/Nintendo) to double down on 3D and launch PS1.</p></li></ol>]]></content:encoded></item><item><title><![CDATA[Resuming after 1 month break]]></title><description><![CDATA[Month-long break, Aimap.today & Techstars San Francisco]]></description><link>https://journal.daniellopes.dev/p/resuming-after-1-month-break</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/resuming-after-1-month-break</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Fri, 13 Sep 2024 04:22:09 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!aJ0x!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aJ0x!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aJ0x!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!aJ0x!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!aJ0x!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aJ0x!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aJ0x!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1963079,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!aJ0x!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!aJ0x!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!aJ0x!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png 1272w, 
https://substackcdn.com/image/fetch/$s_!aJ0x!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e9e280c-5db9-4cde-a571-4c84a35c3443_1456x832.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I've been absent lately, as some of you noticed. I'd set aside three months for a focused "studying" period, and it was a lot of fun to dive deep into prompt engineering, evals, LLM product lifecycles, agents, and more. However, transitioning from my previous full-time job to 8+ hour study days was challenging. 
Because I'm usually not great at taking breaks (I don't think I've ever taken more than two weeks off), this time I was determined to stick to my three-month deadline and disconnect. I hit the deadline and went completely offline. I had a blast spending a month road cycling in Spain &amp; Germany and a lot of time with family.</p><p>Now, I'm recharged and excited to be back! </p><p>Here are some learnings, a new project I shipped this week, and some plans for the next few months:</p><h2>Aimap.today: An Experiment with Agents &amp; Flows</h2><p>I recently focused on learning about agent tooling to build an AI company researcher for personal use.</p><p>The plan: Create an agent to find funding news, identify AI-related companies, analyze their websites, build a knowledge base, write a long-form article about it, and categorize them into three tiers (foundational AI tech, AI-enabled, and AI users).</p><p>I explored several agent frameworks: <a href="https://www.crewai.com/">CrewAI</a>, <a href="https://langchain-ai.github.io/langgraph/">LangGraph</a> (LangChain&#8217;s agent take), <a href="https://www.llamaindex.ai/blog/introducing-llama-deploy-a-microservice-based-way-to-deploy-llamaindex-workflows">LlamaIndex's Llama-Agents</a> (a microservice take on agents, where each agent is its own service), and <a href="https://www.agentops.ai/">AgentOps</a> (a very cool instrumentation tool). I also tried building flows with <a href="https://www.airops.com/">AirOps</a> and <a href="https://n8n.io/">n8n</a>. While fetching and enriching the data, I also tried <a href="https://jina.ai/reader/">Jina Reader/SERP</a>, <a href="https://you.com/">You.com</a> API, <a href="https://www.apollo.io/">Apollo</a> API, and <a href="https://www.clay.com/claygent">Clay agent</a>.</p><p>Ultimately, for now, I decided to take a stab at coding my own thing. 
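</p><p>The shape of what I ended up with is simple. Here is a minimal sketch of the idea (illustrative class and step names only, not the actual aimap code):</p>

```ruby
# A hedged sketch, not the real aimap pipeline: an ordered list of steps,
# each receiving the accumulated context and merging its own result back in.
class Pipeline
  def initialize(*steps)
    @steps = steps
  end

  def call(context = {})
    @steps.reduce(context) { |ctx, step| ctx.merge(step.call(ctx)) }
  end
end

# Illustrative steps; the real ones wrap LLM and API calls.
fetch_news    = ->(_ctx) { { news: ["Acme raises $10M"] } }
classify_tier = ->(ctx) { { tier: ctx[:news].empty? ? "unknown" : "ai_enabled" } }

pipeline = Pipeline.new(fetch_news, classify_tier)
result   = pipeline.call
```

<p>Each step is anything that responds to <code>call</code> and returns a hash to merge back into the shared context, so LLM calls, API fetches, and plain Ruby all compose the same way.</p><p>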
Most existing frameworks focus on LLMs talking to each other, but all I needed was a workflow/pipeline to chain multiple processes together.</p><p>My current boilerplate is in Ruby, and I wasn't satisfied with the existing Ruby packages for API clients like Claude and OpenAI, so I coded those from scratch as well. The whole project, including study time, took about two weeks. </p><p><strong>&#128073; The result is: <a href="https://aimap.today">https://aimap.today</a>.</strong> Please check it out and let me know what you think.</p><p>I built this for myself and plan to add new companies weekly. There's more I want to do, including a weekly digest for newly researched companies. If you sign up, you&#8217;ll be notified when I finish the weekly digest part.</p><p>My next step is to explore more out-of-the-box platforms like AirOps and n8n, and try to convert my little workflow to those tools.</p><p>Code-wise, the pipeline currently looks like this:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!AnT6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f563482-be06-42c0-bd11-270a130469d0_1200x1136.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!AnT6!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f563482-be06-42c0-bd11-270a130469d0_1200x1136.png 424w, https://substackcdn.com/image/fetch/$s_!AnT6!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f563482-be06-42c0-bd11-270a130469d0_1200x1136.png 848w, 
https://substackcdn.com/image/fetch/$s_!AnT6!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f563482-be06-42c0-bd11-270a130469d0_1200x1136.png 1272w, https://substackcdn.com/image/fetch/$s_!AnT6!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f563482-be06-42c0-bd11-270a130469d0_1200x1136.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!AnT6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f563482-be06-42c0-bd11-270a130469d0_1200x1136.png" width="1200" height="1136" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3f563482-be06-42c0-bd11-270a130469d0_1200x1136.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1136,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:271666,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!AnT6!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f563482-be06-42c0-bd11-270a130469d0_1200x1136.png 424w, https://substackcdn.com/image/fetch/$s_!AnT6!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f563482-be06-42c0-bd11-270a130469d0_1200x1136.png 848w, 
https://substackcdn.com/image/fetch/$s_!AnT6!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f563482-be06-42c0-bd11-270a130469d0_1200x1136.png 1272w, https://substackcdn.com/image/fetch/$s_!AnT6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f563482-be06-42c0-bd11-270a130469d0_1200x1136.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><h2>Joined Techstars for the First San Francisco class</h2><p>I'm thrilled to share that I&#8217;ll be helping Techstars&#8217; first 
San Francisco class as an EIR (Entrepreneur in Residence). I've been following my friend and former boss, <a href="https://www.linkedin.com/in/nealsalesgriffin/">Neal</a>, as he ran Techstars Chicago and Oakland. When he mentioned the new San Francisco class, I jumped at the chance to participate.</p><p>The cohort is impressive, featuring many interesting AI-enabled businesses. I'm excited to work with all the founders. You can view the full cohort here: <a href="https://www.techstars.com/newsroom/techstars-announces-inaugural-class-of-san-francisco-accelerator">https://www.techstars.com/newsroom/techstars-announces-inaugural-class-of-san-francisco-accelerator</a></p><div><hr></div><p>It's great to be back. Building aimap.today has been a cool learning experience, and I'm eager to continue improving and expanding it. </p><p>Joining Techstars as an EIR is another exciting opportunity I'm looking forward to.</p><p>I may also be joining a team on a new project soon, but I'll save that for future updates.</p><p>Going forward, my goal is to resume weekly updates.</p>]]></content:encoded></item><item><title><![CDATA[LLM Impact Mapping: A 4-Step Thought Exercise with Claude 3.5]]></title><description><![CDATA[Using Langchain + Claude 3.5 to brainstorm about AI's effect on common roles]]></description><link>https://journal.daniellopes.dev/p/llm-impact-mapping-a-4-step-thought</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/llm-impact-mapping-a-4-step-thought</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Wed, 10 Jul 2024 13:02:11 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!QE8y!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png" length="0" type="image/png"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!QE8y!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!QE8y!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!QE8y!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!QE8y!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!QE8y!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!QE8y!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png" width="1456" height="832" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1996105,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!QE8y!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!QE8y!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!QE8y!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!QE8y!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6a5a006d-314b-4b24-aa78-069078b3a2b4_1456x832.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>I'm starting to plan what to do next in the long run, and I&#8217;m entertaining the idea of joining a team. I think things might be moving too fast, with too much demand for these skills, to risk time on solo projects.</p><h3>Three types of play</h3><p>I've started considering which spaces are interesting now, and I think this take from <a href="https://www.amazon.com/7-Powers-Foundations-Business-Strategy/dp/0998116319">Hamilton Helmer</a> is a good one. There are three types of play:</p><ol><li><p>The tech play: the semiconductor makers themselves; you get Intel, etc.</p></li><li><p>The first-tier play: companies that couldn't have existed before semiconductors (Microsoft).</p></li><li><p>The usage play: car companies with tons of semiconductors. They already existed; they just got better.</p></li></ol><p>The whole video is worth watching, but <a href="https://youtu.be/hKq1_KPSqy0?t=2367">here&#8217;s the timestamp about AI</a>.</p><p>I'm personally more interested in the second play. 
Finding these companies early, and identifying which ones are already headed in a good direction, is hard.</p><h3>Brainstorm exercise</h3><p>A good place to start could be looking at common roles: how they will be impacted, and whether they can be augmented or even replaced.</p><p>A programmer's job is already seriously impacted, customer support even more so. Same for SDRs. Some other roles, like localization, have already been pretty much replaced. These highly impacted roles are the ones I'm most interested in.</p><p>So, just as a thought exercise toying with Python + Langchain, I wrote a 4-step chain with Claude 3.5 to do the following:</p><ol><li><p>Brainstorm departments and roles common across any vertical</p></li><li><p>List the most common tasks in those roles</p></li><li><p>Brainstorm ideas for how each could be augmented or automated</p></li><li><p>(Rank them by difficulty and confidence, but this was just for nudging the model)</p></li></ol><p>I only spent a couple of hours and $3 in credits, but the results turned out better than I expected.</p><p>There's obviously some nonsense in the output, and some ideas that fit regular ML models more than LLMs (my fault; I should have optimized the last step of the chain better). But there are a lot of solid ones that I think haven't happened yet, and some others in the &#8220;augmentation&#8221; category are fully in progress at Figma, Intercom, Cursor, GitHub Copilot, etc. 
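</p><p>Mechanically, the chain above is just four prompts run in sequence, each one fed the previous answer. A minimal sketch of that shape (my actual script used Python + Langchain; here the model call is a stub so the structure stands on its own):</p>

```ruby
# `ask` stands in for a real Claude call; the stub just echoes the instruction.
ask = ->(prompt) { "ANSWER(#{prompt.lines.first.strip})" }

steps = [
  "Brainstorm departments and roles common across any vertical.%{prev}",
  "List the most common tasks in those roles:\n%{prev}",
  "Brainstorm how each task could be augmented or automated:\n%{prev}",
  "Rank the ideas by difficulty and confidence:\n%{prev}"
]

# Each step's output becomes the next step's context.
final = steps.reduce("") { |prev, tpl| ask.call(format(tpl, prev: prev)) }
```

<p>Swap the stub for a real API client and tune the prompts (especially that last ranking step) and you have the whole exercise; the structure itself stays this small.</p><p>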
If you squint and look at the roles with the best ideas, you can see <a href="https://www.qualified.com/ai-sdr">Piper</a>, <a href="https://www.intercom.com/drlp/ai-chatbot">Fin</a>, <a href="https://www.harvey.ai/">Harvey</a>, <a href="https://www.wisq.com/">Wisq</a>, <a href="https://www.cognition.ai/blog/introducing-devin">Devin</a>, <a href="https://www.11x.ai/">11x.ai</a>, etc.</p><h3>You can see the whole list here:</h3><p><a href="https://docs.google.com/spreadsheets/d/1vpXzfUFNhIZJp9CEmiZqGKaVrP_Xx8McMZZlLCxiHNg/edit?gid=0#gid=0">https://docs.google.com/spreadsheets/d/1vpXzfUFNhIZJp9CEmiZqGKaVrP_Xx8McMZZlLCxiHNg/edit?gid=0#gid=0</a></p><p>The next step, I guess, is to start collecting companies and maybe have a little agent help me with the research.</p><p>&#8212;</p><p>I haven't had time to think through the whole list yet, but here are some of the ones that stood out, where the ideas are mostly very doable or already happening:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!XTZp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3d6389c-8d57-47e6-b30e-6b981d6c1c1d_1858x1086.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!XTZp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3d6389c-8d57-47e6-b30e-6b981d6c1c1d_1858x1086.png 424w, https://substackcdn.com/image/fetch/$s_!XTZp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3d6389c-8d57-47e6-b30e-6b981d6c1c1d_1858x1086.png 848w, 
https://substackcdn.com/image/fetch/$s_!XTZp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3d6389c-8d57-47e6-b30e-6b981d6c1c1d_1858x1086.png 1272w, https://substackcdn.com/image/fetch/$s_!XTZp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3d6389c-8d57-47e6-b30e-6b981d6c1c1d_1858x1086.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!XTZp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3d6389c-8d57-47e6-b30e-6b981d6c1c1d_1858x1086.png" width="1456" height="851" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a3d6389c-8d57-47e6-b30e-6b981d6c1c1d_1858x1086.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:851,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:187794,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!XTZp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3d6389c-8d57-47e6-b30e-6b981d6c1c1d_1858x1086.png 424w, https://substackcdn.com/image/fetch/$s_!XTZp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3d6389c-8d57-47e6-b30e-6b981d6c1c1d_1858x1086.png 848w, 
https://substackcdn.com/image/fetch/$s_!XTZp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3d6389c-8d57-47e6-b30e-6b981d6c1c1d_1858x1086.png 1272w, https://substackcdn.com/image/fetch/$s_!XTZp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3d6389c-8d57-47e6-b30e-6b981d6c1c1d_1858x1086.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!BlOh!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F08d1abb9-6463-47a3-adda-95de741f35e2_1856x1078.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!BlOh!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F08d1abb9-6463-47a3-adda-95de741f35e2_1856x1078.png 424w, https://substackcdn.com/image/fetch/$s_!BlOh!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F08d1abb9-6463-47a3-adda-95de741f35e2_1856x1078.png 848w, https://substackcdn.com/image/fetch/$s_!BlOh!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F08d1abb9-6463-47a3-adda-95de741f35e2_1856x1078.png 1272w, https://substackcdn.com/image/fetch/$s_!BlOh!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F08d1abb9-6463-47a3-adda-95de741f35e2_1856x1078.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!BlOh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F08d1abb9-6463-47a3-adda-95de741f35e2_1856x1078.png" width="1456" height="846" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/08d1abb9-6463-47a3-adda-95de741f35e2_1856x1078.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:846,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:207782,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!BlOh!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F08d1abb9-6463-47a3-adda-95de741f35e2_1856x1078.png 424w, https://substackcdn.com/image/fetch/$s_!BlOh!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F08d1abb9-6463-47a3-adda-95de741f35e2_1856x1078.png 848w, https://substackcdn.com/image/fetch/$s_!BlOh!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F08d1abb9-6463-47a3-adda-95de741f35e2_1856x1078.png 1272w, https://substackcdn.com/image/fetch/$s_!BlOh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F08d1abb9-6463-47a3-adda-95de741f35e2_1856x1078.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!DYQl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c1d25e5-330e-49f3-a50d-c7771bd1eb30_1672x1132.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!DYQl!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c1d25e5-330e-49f3-a50d-c7771bd1eb30_1672x1132.png 424w, https://substackcdn.com/image/fetch/$s_!DYQl!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c1d25e5-330e-49f3-a50d-c7771bd1eb30_1672x1132.png 848w, 
https://substackcdn.com/image/fetch/$s_!DYQl!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c1d25e5-330e-49f3-a50d-c7771bd1eb30_1672x1132.png 1272w, https://substackcdn.com/image/fetch/$s_!DYQl!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c1d25e5-330e-49f3-a50d-c7771bd1eb30_1672x1132.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!DYQl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c1d25e5-330e-49f3-a50d-c7771bd1eb30_1672x1132.png" width="1456" height="986" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0c1d25e5-330e-49f3-a50d-c7771bd1eb30_1672x1132.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:986,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:230704,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!DYQl!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c1d25e5-330e-49f3-a50d-c7771bd1eb30_1672x1132.png 424w, https://substackcdn.com/image/fetch/$s_!DYQl!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c1d25e5-330e-49f3-a50d-c7771bd1eb30_1672x1132.png 848w, 
https://substackcdn.com/image/fetch/$s_!DYQl!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c1d25e5-330e-49f3-a50d-c7771bd1eb30_1672x1132.png 1272w, https://substackcdn.com/image/fetch/$s_!DYQl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0c1d25e5-330e-49f3-a50d-c7771bd1eb30_1672x1132.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!D9Q-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6899800e-3f2c-4dba-b440-013c5a15e85f_1670x712.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!D9Q-!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6899800e-3f2c-4dba-b440-013c5a15e85f_1670x712.png 424w, https://substackcdn.com/image/fetch/$s_!D9Q-!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6899800e-3f2c-4dba-b440-013c5a15e85f_1670x712.png 848w, https://substackcdn.com/image/fetch/$s_!D9Q-!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6899800e-3f2c-4dba-b440-013c5a15e85f_1670x712.png 1272w, https://substackcdn.com/image/fetch/$s_!D9Q-!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6899800e-3f2c-4dba-b440-013c5a15e85f_1670x712.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!D9Q-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6899800e-3f2c-4dba-b440-013c5a15e85f_1670x712.png" width="1456" height="621" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6899800e-3f2c-4dba-b440-013c5a15e85f_1670x712.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:621,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:124803,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!D9Q-!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6899800e-3f2c-4dba-b440-013c5a15e85f_1670x712.png 424w, https://substackcdn.com/image/fetch/$s_!D9Q-!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6899800e-3f2c-4dba-b440-013c5a15e85f_1670x712.png 848w, https://substackcdn.com/image/fetch/$s_!D9Q-!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6899800e-3f2c-4dba-b440-013c5a15e85f_1670x712.png 1272w, https://substackcdn.com/image/fetch/$s_!D9Q-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6899800e-3f2c-4dba-b440-013c5a15e85f_1670x712.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!dRf6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd07a9170-2afa-4b1d-8b8f-77506ab052a4_1670x1084.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!dRf6!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd07a9170-2afa-4b1d-8b8f-77506ab052a4_1670x1084.png 424w, https://substackcdn.com/image/fetch/$s_!dRf6!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd07a9170-2afa-4b1d-8b8f-77506ab052a4_1670x1084.png 848w, 
https://substackcdn.com/image/fetch/$s_!dRf6!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd07a9170-2afa-4b1d-8b8f-77506ab052a4_1670x1084.png 1272w, https://substackcdn.com/image/fetch/$s_!dRf6!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd07a9170-2afa-4b1d-8b8f-77506ab052a4_1670x1084.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!dRf6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd07a9170-2afa-4b1d-8b8f-77506ab052a4_1670x1084.png" width="1456" height="945" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d07a9170-2afa-4b1d-8b8f-77506ab052a4_1670x1084.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:945,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:206206,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!dRf6!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd07a9170-2afa-4b1d-8b8f-77506ab052a4_1670x1084.png 424w, https://substackcdn.com/image/fetch/$s_!dRf6!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd07a9170-2afa-4b1d-8b8f-77506ab052a4_1670x1084.png 848w, 
https://substackcdn.com/image/fetch/$s_!dRf6!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd07a9170-2afa-4b1d-8b8f-77506ab052a4_1670x1084.png 1272w, https://substackcdn.com/image/fetch/$s_!dRf6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd07a9170-2afa-4b1d-8b8f-77506ab052a4_1670x1084.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p>]]></content:encoded></item><item><title><![CDATA[Week 10: Mostly LLM Evaluations]]></title><description><![CDATA[Machine 
learning + Evaluations]]></description><link>https://journal.daniellopes.dev/p/week-10-mostly-llm-evaluations</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/week-10-mostly-llm-evaluations</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Mon, 08 Jul 2024 22:24:14 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The first week of July has already passed. If you're new here: two months ago, I left my full-time job to focus on exploring LLMs. While I've learned a lot in just two months, June wasn't as productive as I'd hoped, with too much time spent on non-AI coding projects, leaving me with time for just theory and conferences. I'm aiming to correct this in July.</p><h2>Traditional machine learning</h2><p>When you move beyond simply calling APIs, you'll encounter the traditional ML stack and terminology. For instance, if you need to adjust your embeddings and choose a different model, you'll need a solid grasp of metrics like BPB, BPC, Perplexity, and so on. The same goes for evaluating production data.</p><p>I also prefer not to have components in a stack that I don't understand at a high level, even if I don't have to work with them directly (for example, if I'm an EM on the product side or a product manager).</p><p>For ML theory, I&#8217;m currently <a href="https://www.youtube.com/playlist?list=PLrQmbzbRJ5mwDinvDEJ5B-KDZlPM-sCYO">halfway through this course</a>. It may be a few years old, but I appreciate how practical and real-world focused it is.
If anyone has other suggestions, I'm very open to them.</p><p>I can't recall the exact source of this slide (I think it was from <a href="https://x.com/HamelHusain?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor">Hamel Husain&#8217;s talk</a> at the AI Engineer conference this month), but I believe it provides a reasonable overview, and my goal is to have at least a basic understanding of everything under the horizontal line.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!I8IB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe270dc33-bd20-45fd-a9e8-47ba92e5bec0_1472x874.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!I8IB!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe270dc33-bd20-45fd-a9e8-47ba92e5bec0_1472x874.png 424w, https://substackcdn.com/image/fetch/$s_!I8IB!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe270dc33-bd20-45fd-a9e8-47ba92e5bec0_1472x874.png 848w, https://substackcdn.com/image/fetch/$s_!I8IB!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe270dc33-bd20-45fd-a9e8-47ba92e5bec0_1472x874.png 1272w, https://substackcdn.com/image/fetch/$s_!I8IB!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe270dc33-bd20-45fd-a9e8-47ba92e5bec0_1472x874.png 1456w" sizes="100vw"><img
src="https://substackcdn.com/image/fetch/$s_!I8IB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe270dc33-bd20-45fd-a9e8-47ba92e5bec0_1472x874.png" width="1456" height="865" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e270dc33-bd20-45fd-a9e8-47ba92e5bec0_1472x874.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:865,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:842510,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!I8IB!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe270dc33-bd20-45fd-a9e8-47ba92e5bec0_1472x874.png 424w, https://substackcdn.com/image/fetch/$s_!I8IB!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe270dc33-bd20-45fd-a9e8-47ba92e5bec0_1472x874.png 848w, https://substackcdn.com/image/fetch/$s_!I8IB!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe270dc33-bd20-45fd-a9e8-47ba92e5bec0_1472x874.png 1272w, https://substackcdn.com/image/fetch/$s_!I8IB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe270dc33-bd20-45fd-a9e8-47ba92e5bec0_1472x874.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft 
pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h2>Evaluations</h2><p>Last week, I began studying evaluations more seriously for two reasons. First, I'm working on understanding the various stages of the development cycle. Second, I need to establish a better workflow for iterating on our <a href="https://canopy.is/m/training/ai_assistant">AI bot at Canopy</a>.</p><p>For theory around evals I really liked <a href="https://x.com/chipro">Chip</a>&#8217;s <a href="https://learning.oreilly.com/library/view/ai-engineering/9781098166298/ch03.html#why_ai_as_a_judge">third chapter of her WIP book</a>. 
This <a href="https://hamel.dev/blog/posts/evals/">post about Evaluations from Hamel Husain</a> is also so good, and <a href="https://applied-llms.org/#evaluation-monitoring">this one as well</a>.</p><p><strong>This would be the ideal cycle for me:</strong></p><p></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Or6c!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Or6c!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png 424w, https://substackcdn.com/image/fetch/$s_!Or6c!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png 848w, https://substackcdn.com/image/fetch/$s_!Or6c!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png 1272w, https://substackcdn.com/image/fetch/$s_!Or6c!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Or6c!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png" width="1456" height="887"
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:887,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:144725,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Or6c!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png 424w, https://substackcdn.com/image/fetch/$s_!Or6c!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png 848w, https://substackcdn.com/image/fetch/$s_!Or6c!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png 1272w, https://substackcdn.com/image/fetch/$s_!Or6c!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43931fc2-a950-4d4e-8662-44e8ded1f3a4_2144x1306.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">This is sort of the workflow that I think would be ideal for me.</figcaption></figure></div><p>At Canopy, our tech stack is entirely Ruby, and our AI Assistant was built using direct API calls to OpenAI. While I did implement some observability, I still need a more robust way to monitor the quality of answers in production and a solid test suite that I can run when updating the model or modifying the prompts.</p><p>Ideally, I would have automated tests similar to regular unit tests that I can run whenever I'm iterating on prompts, models, or content format for the RAG. Once things are deployed, I'd like to have comparable quality metrics for the production data, with alerts if quality drops below certain thresholds, and proper visibility.</p><p>There are quite a few LLM-focused evaluation tools available, and many of them are really solid.
I don't mean to dismiss their work, but for my specific context, here's my current assessment:</p><ol><li><p>They do too much</p><ol><li><p>Most tools attempt to offer solutions for all four stages of my diagram, but these companies are often small, seed-level teams building in a rapidly evolving space. Many of these products either feel half-baked or are still trying to figure out their business model. I'd prefer a tool that focuses solely on the two evaluation parts (dev &amp; production) and doesn't try to handle the prototyping and deployment aspects.</p></li></ol></li><li><p>Their business models are still unclear</p><ol><li><p>Many tools are in the typical early-stage B2B mode, lacking a product-led path and aiming to lock users into annual contracts with case-by-case pricing. I understand this is important in a new market with early-stage products, but I'd love to see more product-led options (especially for Canopy's context, a small team where the bot isn't their core feature yet).</p></li></ol></li><li><p>Vendor lock-in / dependency risk</p><ol><li><p>To enable tracing, many tools heavily wrap around API clients for services like OpenAI, and some don't even offer public API versions. Tools that offer "solutions" for the deployment part often proxy calls under their own API, so instead of calling OpenAI directly, you call their API. This introduces a third-party proxy as a potential failure point, in addition to OpenAI/Anthropic/etc., and it's built by an understaffed seed company.</p></li></ol></li><li><p>Pretty much no solutions for non-Python/JavaScript stacks</p><ol><li><p>Most niceties for automatic logging/tracing and basic metrics like latency come from their SDKs, which are often limited to Python/JS. While understandable given the early stage, most tools also don't offer a public API.
For more elaborate setups, a reasonable workflow could be to follow the traditional ML path of using Python for the stack and wrapping it as an API for other languages to call, instead of relying on a proxy maintained by a tiny company in a crowded space. It's a bit of a bummer because for AI products that only need LLMs, it's an overcomplication in the stack. You'll definitely need query augmentation, retries, rate-limiting controls, and other things product-level software already handles, and now you'll need to duplicate that in the Python stack too. It's one thing when you're making an API around a model you control, but it's another when you're making an API around another API.</p></li></ol></li></ol><p>If you're already using a Python or JS stack, you have fewer things to worry about. However, for Canopy (Ruby stack), the situation is a bit different. I think I'll go with the following approach:</p><ul><li><p><strong>First:</strong> Find a tool that has public API support for sending run data from production, build a small Ruby client for that, and do production-level monitoring only (similar to how I&#8217;d use a service like Sentry). Ignore the other three stages of the Experiment and Deploy parts of these tools.</p><ul><li><p><strong>Second:</strong> </p><ul><li><p>Have good unit test coverage for the direct calls to OpenAI with VCR and reimplement some of the basic eval metrics relevant to my context, like <strong>Answer Relevance, Context Recall, or Context Relevancy.
</strong></p></li><li><p>Or use something like <a href="https://www.promptfoo.dev/">Promptfoo</a>, <a href="https://ragas.io/">Ragas</a> or <a href="https://www.guardrailsai.com/">GuardRails</a> for the local unit tests.</p></li></ul></li></ul><p></p></li></ul><p>I haven't decided which product to use yet, but reading through their documentation, examining the code of their SDKs, and exploring their offerings has been a good learning process already.</p><p>In no particular order, here's the current list of evaluation products/projects that I'm aware of (although I haven't had a chance to thoroughly check all of them yet):</p><ul><li><p>https://mlflow.org</p></li><li><p>https://www.vellum.ai/</p></li><li><p>https://athina.ai</p></li><li><p>https://www.patronus.ai</p></li><li><p>https://wandb.ai/site/weave</p></li><li><p>https://autochain.forethought.ai/</p></li><li><p>https://ragas.io/</p></li><li><p>https://github.com/agenta-ai/agenta</p></li><li><p>https://klu.ai/</p></li><li><p>https://www.braintrustdata.com/</p></li><li><p>https://humanloop.com/</p></li><li><p>https://www.parea.ai/</p></li><li><p>https://docs.llamaindex.ai/en/stable/understanding/evaluating/evaluating/</p></li><li><p>https://www.langchain.com/langsmith</p></li><li><p>https://github.com/langfuse/langfuse</p></li><li><p>https://www.rungalileo.io/</p></li><li><p>https://www.guardrailsai.com/</p></li><li><p>https://deepchecks.com</p></li><li><p>https://arize.com</p></li><li><p>https://github.com/UKGovernmentBEIS/inspect_ai</p></li></ul><p>If you're using a tool that you really like or if you have any feedback or a different perspective on the market, I'd love to hear your thoughts.</p>]]></content:encoded></item><item><title><![CDATA[Month 2]]></title><description><![CDATA[Lots of papers, coffee chats, contract work, and conferences&#8230;]]></description><link>https://journal.daniellopes.dev/p/week-5-6-and-7</link><guid 
isPermaLink="false">https://journal.daniellopes.dev/p/week-5-6-and-7</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Thu, 27 Jun 2024 03:35:40 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I've been away for a couple of weeks; I spent probably 30% of my time networking and meeting people, and the rest helping my friends at <a href="https://getkoala.com">Koala</a>. This meant missing two weekly posts and delaying the public release of <a href="https://replayjournal.ai">Replay</a>. The app is working well, but I&#8217;m having a tough time finding time for the last details and GTM.</p><p>While I haven't done much hands-on AI engineering these last two weeks, I've been keeping up with theory. Here are some of the notes:</p><h2>Daily Prompting Paper</h2><p>For a while now, I've been checking <a href="https://arxiv.org/search/?query=LLM+prompt&amp;searchtype=all&amp;source=header">Arxiv daily for new papers on Prompt Engineering</a>. It's a topic that's usually easy to understand, even in academic papers, so it's a great way to kill time between tasks.</p><p>Many poor results in AI come from inadequate prompting skills. Techniques like prompt chaining and retries weren't practical before, but they make sense now.</p><p>In June, <a href="https://arxiv.org/abs/2406.06608">this excellent paper</a> was published. I spent the last couple of weeks reading it thoroughly, along with many of its citations.
I've <a href="https://journal.daniellopes.dev/p/practical-prompt-engineering-notes">compiled some notes here</a>, and I'm also planning to create a better-structured resource with more examples and additional notes I didn't include.</p><h2>Meetups and Conferences</h2><p>I've been attending several meetups and conferences, including <a href="https://www.aiqualityconference.com">AIQCon</a> yesterday. It's interesting to see the growing focus on AI Engineering (meaning implementing GenAI in products). Last year, there was a lot of interest in LLMs themselves, but now the emphasis is on making LLMs perform well in real-world applications.</p><p>Key topics include:</p><ul><li><p>RAG (Retrieval-Augmented Generation): Techniques for improving embeddings, content augmentation, and advanced techniques like GraphRAG.</p></li><li><p>Evaluation: Setting up good test suites, deciding what to test, and determining who should write the tests.</p></li><li><p>No fine-tuning: The consensus is to focus on prompting, RAG, and evaluation before considering fine-tuning. There's already a lot of pain and scarring from fine-tuning.</p></li></ul><p>While some events feel like they have a high noise-to-signal ratio, I always learn a lot as a newcomer. Also, it&#8217;s such a great time to live in SF.
For future conferences, I plan to research the participants and their topics more thoroughly beforehand.</p><p>A useful resource from yesterday's conference: <a href="https://huyenchip.com/llama-police.html">https://huyenchip.com/llama-police.html</a> by <a href="https://x.com/chipro">Chip Huyen</a> (well worth following on X).</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!O9hC!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe49d4ae-5b73-4b6c-8c1b-77d9467ffa75_1396x1280.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!O9hC!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe49d4ae-5b73-4b6c-8c1b-77d9467ffa75_1396x1280.png 424w, https://substackcdn.com/image/fetch/$s_!O9hC!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe49d4ae-5b73-4b6c-8c1b-77d9467ffa75_1396x1280.png 848w, https://substackcdn.com/image/fetch/$s_!O9hC!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe49d4ae-5b73-4b6c-8c1b-77d9467ffa75_1396x1280.png 1272w, https://substackcdn.com/image/fetch/$s_!O9hC!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe49d4ae-5b73-4b6c-8c1b-77d9467ffa75_1396x1280.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!O9hC!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe49d4ae-5b73-4b6c-8c1b-77d9467ffa75_1396x1280.png" width="1396" height="1280" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/be49d4ae-5b73-4b6c-8c1b-77d9467ffa75_1396x1280.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1280,&quot;width&quot;:1396,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2923913,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!O9hC!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe49d4ae-5b73-4b6c-8c1b-77d9467ffa75_1396x1280.png 424w, https://substackcdn.com/image/fetch/$s_!O9hC!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe49d4ae-5b73-4b6c-8c1b-77d9467ffa75_1396x1280.png 848w, https://substackcdn.com/image/fetch/$s_!O9hC!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe49d4ae-5b73-4b6c-8c1b-77d9467ffa75_1396x1280.png 1272w, https://substackcdn.com/image/fetch/$s_!O9hC!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbe49d4ae-5b73-4b6c-8c1b-77d9467ffa75_1396x1280.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h2>Books</h2><p>On the topic of learning more theory, I'm currently going through a few resources:</p><p>1. <strong>Programming Machine Learning</strong></p><p><a href="https://www.amazon.com/Programming-Machine-Learning-Zero-Deep/dp/1680506609">https://www.amazon.com/Programming-Machine-Learning-Zero-Deep/dp/1680506609</a></p><p>I first read this when it was released, but I've forgotten most of it. Although it's a bit outdated and pre-LLM, I like the author's writing. Given how close AI Engineering is to traditional ML, I want to revisit this material.</p><p><strong>2. Neural Nets mini-course by 3Blue1Brown</strong></p><p>Another entry-level resource I've been revisiting this week is this mini-course on neural nets: <a href="https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi">https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi</a></p><p>3. 
<strong>O&#8217;Reilly&#8217;s</strong> <strong>AI Engineering book</strong></p><p><a href="https://www.oreilly.com/library/view/ai-engineering/9781098166298/cover.html">https://www.oreilly.com/library/view/ai-engineering/9781098166298/cover.html</a></p><p>This book was recommended by someone close to me, and I attended a talk by the author yesterday - it was one of my favorites, even though it was too short. Only three chapters are available now, so I'll keep an eye out for future updates.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!qsdD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!qsdD!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!qsdD!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!qsdD!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!qsdD!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!qsdD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1804718,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!qsdD!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!qsdD!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!qsdD!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!qsdD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F03bc19a5-6386-4d94-bb64-4fac0fa22b65_1456x832.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft 
pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div>]]></content:encoded></item><item><title><![CDATA[A Comprehensive Guide to Text Prompt Engineering Techniques]]></title><description><![CDATA[My Summary and Notes from all Text-Based Prompting Techniques in "The Prompt Report" Paper and many of its cited works]]></description><link>https://journal.daniellopes.dev/p/practical-prompt-engineering-notes</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/practical-prompt-engineering-notes</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Fri, 21 Jun 2024 13:02:40 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!6dAo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png" length="0" type="image/png"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!6dAo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!6dAo!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!6dAo!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!6dAo!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!6dAo!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!6dAo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png" width="1456" height="832" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1901136,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!6dAo!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!6dAo!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!6dAo!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!6dAo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ed93d0d-18ce-4088-bd4d-92de08feebd7_1456x832.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I read "<strong><a href="https://arxiv.org/pdf/2406.06608">The Prompt Report: A Systematic Survey of Prompting Techniques</a></strong>" from start to finish, as well as many of the papers it cites, and extracted everything I thought would be a useful addition to the text-prompting toolbelt (I left out the benchmarking and multi-modal techniques).</p><p>Here's everything; maybe you'll find it useful too:</p><h2>1. In-Context Learning (ICL) and Few-Shot Prompting</h2><p>Few-shot prompting is a form of In-Context Learning (ICL). 
It allows AI models to perform tasks based on a small number of examples or instructions within the prompt, without additional training.</p><h3>1.1 Key Concepts of ICL</h3><ul><li><p>Skills can be derived from exemplars or instructions in the prompt</p></li><li><p>May utilize pre-existing knowledge rather than true "learning"</p></li><li><p>Viewed as task specification rather than acquiring new skills</p></li><li><p>Current research focuses on optimization and understanding of ICL</p></li></ul><h3>1.2 Examples of ICL</h3><ol><li><p>Learning from exemplars:</p></li></ol><pre><code><code>System:
Translate English to French:
English: Hello
French: Bonjour
English: Goodbye
French: Au revoir
---

User:
English: How are you?</code></code></pre><ol start="2"><li><p>Learning from instructions:</p></li></ol><pre><code><code>System Prompt:
Identify the subject and predicate in sentences. The subject is the doer, and the predicate is the action.

Example:
Sentence: The cat chased the mouse.
Subject: The cat
Predicate: chased the mouse

Now, your turn:
--

User:
Sentence: The sun rises in the east.
[Model identifies subject and predicate]</code></code></pre><ol start="3"><li><p>Task specification:</p></li></ol><pre><code><code>Prompt:
Generate a haiku about artificial intelligence. A haiku is a three-line poem with five syllables in the first line, seven in the second, and five in the third.</code></code></pre><p>ICL enables flexible task performance without fine-tuning, but its effectiveness depends on prompt design and example selection. The technique is limited by the model's maximum input length and may not represent true learning of new skills.</p><h3>1.3 Few-Shot Prompting Design Decisions</h3><p>The effectiveness of ICL and few-shot prompting techniques depends significantly on how the examples (exemplars) are designed <strong>and presented</strong>. Here are six critical design decisions:</p><h4>1.3.1 Exemplar Quantity</h4><ul><li><p>Generally, more exemplars improve performance, especially in larger models.</p></li><li><p>Benefits may diminish beyond 20 exemplars in some cases.</p></li></ul><h4>1.3.2 Exemplar Ordering</h4><ul><li><p>The order of exemplars affects model behavior.</p></li><li><p>On some tasks, exemplar order can cause accuracy to vary from below 50% to over 90%.</p></li></ul><h4>1.3.3 Exemplar Label Distribution</h4><ul><li><p>The distribution of labels in exemplars can bias the model's output.</p></li><li><p>Example: 10 exemplars of one class and 2 of another may bias the model towards the first class.</p></li></ul><h4>1.3.4 Exemplar Label Quality</h4><ul><li><p>The necessity of strictly valid demonstrations is unclear.</p></li><li><p>Some studies suggest label accuracy may be irrelevant, while others show significant impact.</p></li><li><p>Larger models often handle incorrect or unrelated labels better.</p></li></ul><blockquote><p>Note: When automatically constructing prompts from large datasets, consider studying how label quality affects your results.</p></blockquote><h4>1.3.5 Exemplar Format</h4><ul><li><p>Common format: "Q: {input}, A: {label}"</p></li><li><p>Optimal format may vary across tasks; experiment with multiple formats.</p></li><li><p>Formats common in the 
training data may lead to better performance.</p></li></ul><h4>1.3.6 Exemplar Similarity</h4><ul><li><p>Selecting exemplars similar to the test sample is generally beneficial.</p></li><li><p>In some cases, more diverse exemplars can improve performance.</p></li></ul><p>These design decisions critically influence output quality in few-shot prompting and ICL.</p><h2>2. Few-Shot Prompting Techniques</h2><p>Few-Shot Prompting can be challenging to implement effectively. Here are some techniques that can be used in a supervised setting:</p><h3>2.1 Techniques for Selecting Exemplars</h3><h4>2.1.1 K-Nearest Neighbor (KNN)</h4><ul><li><p>Run your examples through KNN and select exemplars similar to the test input to boost performance</p></li><li><p>Effective but can be time and resource-intensive during prompt generation</p></li></ul><h4>2.1.2 Vote-K</h4><ul><li><p>Two-stage process for selecting similar exemplars:</p></li></ul><ol><li><p>Model proposes useful unlabeled candidate exemplars for annotation</p></li><li><p>Labeled pool is used for Few-Shot Prompting</p></li></ol><ul><li><p>Ensures diversity and representativeness of exemplars</p></li></ul><h4>2.1.3 Self-Generated In-Context Learning (SG-ICL)</h4><ul><li><p>Uses a generative AI to automatically generate exemplars</p></li><li><p>Better than zero-shot scenarios when training data is unavailable</p></li><li><p>Generated samples are not as effective as actual data</p></li></ul><p>To Consider: <a href="https://developer.nvidia.com/blog/leverage-our-latest-open-models-for-synthetic-data-generation-with-nvidia-nemotron-4-340b/">https://developer.nvidia.com/blog/leverage-our-latest-open-models-for-synthetic-data-generation-with-nvidia-nemotron-4-340b/</a></p><h3>2.2 Prompt Mining</h3><ul><li><p>Process of discovering optimal "middle words" in prompts through large corpus analysis</p></li><li><p>Aims to find prompt formats that occur frequently in the corpus</p></li><li><p>Frequently occurring formats may lead to improved prompt 
performance</p></li><li><p>Example: Instead of using "Q: A:" format, finding a similar but more common format in the corpus</p></li></ul><p>These techniques can help improve the effectiveness of few-shot prompting by optimizing exemplar selection and prompt format. The choice of technique may depend on the specific task, available resources, and the characteristics of the dataset and model being used.</p><h2>3. Zero-Shot Prompting Techniques</h2><p>Zero-Shot Prompting uses no exemplars, in contrast to Few-Shot Prompting. Here are several standalone zero-shot techniques:</p><h3>3.1 Role Prompting (Persona Prompting)</h3><ul><li><p>Assigns a specific role to the AI in the prompt (e.g., "You are a travel writer")</p></li><li><p>Can improve outputs for open-ended tasks and sometimes improve accuracy on benchmarks</p></li></ul><p>Example:</p><pre><code><code>You are an experienced travel writer for a luxury lifestyle magazine. Describe the experience of visiting the {{city}} in {{country}}, focusing on the sensory details and exclusive experiences a high-end traveler might enjoy.</code></code></pre><h3>3.2 Style Prompting</h3><ul><li><p>Specifies desired style, tone, or genre in the prompt</p></li><li><p>Similar effect can be achieved using role prompting</p></li></ul><p>Examples:</p><pre><code><code>Write a short sales pitch for a {{product}} in a persuasive, benefit-focused style. Emphasize how the product solves customer problems in a straightforward way.</code></code></pre><pre><code><code>Compose a follow-up email to a potential client {{client_name}} after a sales call, using a friendly and conversational tone. The goal is to build rapport and schedule a product demo.</code></code></pre><pre><code><code>Convert the following git commit message and code diff into a clear, concise changelog entry. Use a professional tone and focus on the user-facing changes and improvements. 
The entry should be understandable to non-technical users.</code></code></pre><h3>3.3 Emotion Prompting</h3><ul><li><p>Incorporates phrases of psychological relevance to humans (e.g., "This is important to my career")</p></li><li><p>May lead to improved performance on benchmarks and open-ended text generation</p></li></ul><h3>3.4 System 2 Attention (S2A)</h3><ol><li><p>Asks the AI to rewrite the prompt, removing unrelated information</p></li><li><p>Passes the new prompt to the AI for a final response</p></li></ol><p>Example:</p><pre><code><code>System prompt:

Given the following text by a user, extract the part that is unbiased and not their opinion, so that using that text alone would be good context for providing an unbiased answer to the question portion of the text.

Please include the actual question or query that the user is asking. Separate this into two categories labeled with &#8220;Unbiased text context (includes all content except user&#8217;s bias):&#8221; and &#8220;Question/Query (does not include user bias/preference):&#8221;.
--

Text by User: [ORIGINAL INPUT PROMPT]</code></code></pre><p><a href="https://arxiv.org/pdf/2311.11829">https://arxiv.org/pdf/2311.11829</a></p><h3>3.5 SimToM (Simulation Theory of Mind)</h3><ul><li><p>Deals with complicated questions involving multiple people or objects</p></li></ul><ol><li><p>Establishes the set of facts one person knows</p></li><li><p>Answers the question based only on those facts</p></li></ol><ul><li><p>Helps eliminate the effect of irrelevant information in the prompt</p></li></ul><pre><code><code>Single Prompt:

Your task is in two steps.
Step 1. output only the events that
{character_name} knows about.
Step 2. Imagine you are {character_name},
then answer a question based only on the
events {character_name} knows about.
Story: {story}
Question: {question}</code></code></pre><pre><code><code>The following is a sequence of events about some characters, that takes place in multiple locations.
Your job is to output only the events that the specified character, {character}, knows about.
Here are a few rules:
1. A character knows about all events that they do.
2. If a character is in a certain room/location, that character knows about all other events that happens in the room. This includes other characters leaving or exiting the location, the locations of objects in that location, and whether somebody moves an object to another place.
3. If a character leaves a location, and is NOT in that location, they no longer know about any events that happen within that location. However, they can re-enter the location.
Story: {story}
What events does {character} know about?
Only output the events according to the
above rules, do not provide an
explanation.</code></code></pre><pre><code><code>{perspective}
You are {name}.

Based on the above information, answer the following question:

{question}

Keep your answer concise, one sentence is
enough. You must choose one of the above
choices.</code></code></pre><pre><code><code>Imagine you are {name}, and consider this story that has an unexpected event.

{story}

If the last sentence of the story says {name} notices, sees or realizes the unexpected event in this story, simply output the original story with nothing
changed.

However, if the sentence says you are not aware of the changes in this story, output only the events you know, i.e., the sentences before the unexpected event happens.

Output either the original story or the edited story, nothing else.

Format your answer as follows:
Sees/Notices/Realizes: (Yes/No)

Story:</code></code></pre><pre><code><code>Answer the questions based on the context.
Keep your answer concise, few words are
enough, maximum one sentence. Answer as

&#8217;Answer:&lt;option&gt;)&lt;answer&gt;&#8217;.
{perspective}

You are {name}.
{question}

Choose the most straightforward answer.</code></code></pre><pre><code><code>Task Instructions:

I will give you an excerpt. Your task involves three steps:

1. Identify the Sentence Describing the Unexpected Change:

Find the sentence that describes how the situation unexpectedly changed.

2. Determine the Main Character's Awareness:

Identify if the main character comes to know about, or notices, this change at the end.  

3. Edit Based on Awareness:

- If the main character does not know about this change, edit the excerpt and output the part of the excerpt BEFORE the sentence that describes the change.

- If the main character does know about the change, do not edit the excerpt, and output the original story.

Examples:  

Story:

Olumide, a skilled woodcarver in a Nigerian village, is preparing to carve a beautiful sculpture for the village chief. Olumide wants to use a sharp chisel to create intricate details on the sculpture. Olumide observes his set of chisels and sees one that appears to be sharp and in perfect condition. However, while Olumide is talking to a fellow artisan, a child from the village accidentally drops the chisel, causing it to become blunt and damaged. Olumide does not notice the damaged chisel on the ground.

Sentence: However, while Olumide is talking to a fellow artisan, a child from the village accidentally drops the chisel, causing it to become blunt and damaged.

Knows about or notices change: No  

Edit:
Olumide, a skilled woodcarver in a Nigerian village, is preparing to carve a beautiful sculpture for the village chief. Olumide wants to use a sharp chisel to create intricate details on the sculpture. Olumide observes his set of chisels and sees one that appears to be sharp and in perfect condition.

Story:

{story}</code></code></pre><p><a href="https://arxiv.org/pdf/2311.10227">https://arxiv.org/pdf/2311.10227</a></p><h3>3.6 Rephrase and Respond (RaR)</h3><ul><li><p>Instructs the AI to rephrase and expand the question before generating the final answer</p></li><li><p>Can be done in a single pass or as separate steps</p></li><li><p>Has shown improvements on multiple benchmarks</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!gsGg!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F938bd70c-e97d-4370-bb24-9edf9b4669e4_2520x760.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!gsGg!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F938bd70c-e97d-4370-bb24-9edf9b4669e4_2520x760.png 424w, https://substackcdn.com/image/fetch/$s_!gsGg!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F938bd70c-e97d-4370-bb24-9edf9b4669e4_2520x760.png 848w, https://substackcdn.com/image/fetch/$s_!gsGg!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F938bd70c-e97d-4370-bb24-9edf9b4669e4_2520x760.png 1272w, https://substackcdn.com/image/fetch/$s_!gsGg!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F938bd70c-e97d-4370-bb24-9edf9b4669e4_2520x760.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!gsGg!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F938bd70c-e97d-4370-bb24-9edf9b4669e4_2520x760.png" width="1456" height="439" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/938bd70c-e97d-4370-bb24-9edf9b4669e4_2520x760.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:439,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:283245,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!gsGg!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F938bd70c-e97d-4370-bb24-9edf9b4669e4_2520x760.png 424w, https://substackcdn.com/image/fetch/$s_!gsGg!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F938bd70c-e97d-4370-bb24-9edf9b4669e4_2520x760.png 848w, https://substackcdn.com/image/fetch/$s_!gsGg!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F938bd70c-e97d-4370-bb24-9edf9b4669e4_2520x760.png 1272w, https://substackcdn.com/image/fetch/$s_!gsGg!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F938bd70c-e97d-4370-bb24-9edf9b4669e4_2520x760.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><a href="https://arxiv.org/pdf/2311.04205">https://arxiv.org/pdf/2311.04205</a></p><h3>3.7 Re-reading (RE2)</h3><ul><li><p>Adds the phrase "Read the question again:" to the prompt and repeats the question</p></li><li><p>Simple technique that has shown improvement in reasoning benchmarks, especially with complex questions</p></li></ul><p>Example:</p><pre><code><code>########### Task Instruction ##############

# You will write python program to solve math problems.

# You will only write code blocks.

# Please generate your code block in 'def solution()' function, and thus it can be executed by python interpreter. You don't need to call 'solution()' function because

# it will be called by the system.

# The concrete format of 'solution()' is as follows:

# def solution():

# """&lt;question&gt;"""

# &lt;your code&gt;

# result = &lt;your result&gt;

# return result

#########################################

# Q: {question}

# Read the question again: {question}

# Your defined "solution()" function with comments here.</code></code></pre><p>https://arxiv.org/pdf/2309.06275</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ckBG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66e06a66-9658-41f2-9f67-61c6360eca89_1524x1220.png"><img src="https://substackcdn.com/image/fetch/$s_!ckBG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66e06a66-9658-41f2-9f67-61c6360eca89_1524x1220.png" width="1456" height="1166" class="sizing-normal" alt="" loading="lazy"></a></figure></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!w-y7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe6ade8a0-3754-4406-8f81-d4216a028586_1474x1370.png"><img src="https://substackcdn.com/image/fetch/$s_!w-y7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe6ade8a0-3754-4406-8f81-d4216a028586_1474x1370.png" width="1456" height="1353" class="sizing-normal" alt="" loading="lazy"></a></figure></div><h3>3.8 Self-Ask</h3><ol><li><p>Prompts the AI to decide if follow-up questions are needed</p></li><li><p>If needed, the AI generates
these questions</p></li><li><p>The AI answers the follow-up questions</p></li><li><p>Finally, the AI answers the original question</p></li></ol><p>These zero-shot techniques can be used individually or in combination with other prompting methods to improve AI performance on various tasks without the need for task-specific examples.</p><p>The paper provides both a zero-shot variant and the following four-shot example:</p><pre><code>prompt = ['''Question: Who lived longer, Muhammad Ali or Alan Turing?
Are follow up questions needed here: Yes.
Follow up: How old was Muhammad Ali when he died?
Intermediate answer: Muhammad Ali was 74 years old when he died.
Follow up: How old was Alan Turing when he died?
Intermediate answer: Alan Turing was 41 years old when he died.
So the final answer is: Muhammad Ali 

Question: When was the founder of craigslist born?
Are follow up questions needed here: Yes.
Follow up: Who was the founder of craigslist?
Intermediate answer: Craigslist was founded by Craig Newmark.
Follow up: When was Craig Newmark born?
Intermediate answer: Craig Newmark was born on December 6, 1952.
So the final answer is: December 6, 1952

Question: Who was the maternal grandfather of George Washington?
Are follow up questions needed here: Yes.
Follow up: Who was the mother of George Washington?
Intermediate answer: The mother of George Washington was Mary Ball Washington.
Follow up: Who was the father of Mary Ball Washington?
Intermediate answer: The father of Mary Ball Washington was Joseph Ball.
So the final answer is: Joseph Ball 

Question: Are both the directors of Jaws and Casino Royale from the same country? 
Are follow up questions needed here: Yes. 
Follow up: Who is the director of Jaws? 
Intermediate Answer: The director of Jaws is Steven Spielberg. 
Follow up: Where is Steven Spielberg from? 
Intermediate Answer: The United States. 
Follow up: Who is the director of Casino Royale? 
Intermediate Answer: The director of Casino Royale is Martin Campbell. 
Follow up: Where is Martin Campbell from? 
Intermediate Answer: New Zealand. 
So the final answer is: No

Question: ''', 
# Second list element: the suffix appended after the user's question.
'''
Are follow up questions needed here:''', ]</code></pre><p>https://github.com/ofirpress/self-ask?tab=readme-ov-file</p><h2>4. Thought Generation Techniques</h2><p>Thought Generation encompasses techniques that prompt the AI to articulate its reasoning while solving a problem. These methods can significantly enhance performance in mathematics and reasoning tasks.</p><h3>4.1 Chain-of-Thought (CoT) Prompting</h3><ul><li><p>Uses few-shot prompting to encourage the AI to express its thought process</p></li><li><p>Prompt includes an exemplar with a question, reasoning path, and correct answer</p></li><li><p>Example format:</p></li></ul><pre><code><code>Q: Jack has two baskets, each containing three balls. How many balls does Jack have in total?

A: One basket contains 3 balls, so two baskets contain 3 * 2 = 6 balls.

Q: {QUESTION}

A:</code></code></pre><h3>4.2 Zero-Shot-CoT</h3><ul><li><p>Requires no exemplars</p></li><li><p>Appends a thought-inducing phrase to the prompt, such as:</p><ul><li><p>"Let's think step by step."</p></li><li><p>"Let's work this out in a step by step way to be sure we have the right answer."</p></li><li><p>"First, let's think about this logically."</p></li></ul></li><li><p>Task-agnostic: the same phrase can be reused across tasks</p></li></ul><h3>4.3 Step-Back Prompting</h3><ul><li><p>Modification of CoT</p></li><li><p>First asks a generic, high-level question about relevant concepts or facts</p></li><li><p>Then proceeds with reasoning</p></li><li><p>Has shown significant improvements on reasoning benchmarks</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!yp5_!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09661276-95b3-4cda-938a-33511ed3b362_1378x806.png"><img src="https://substackcdn.com/image/fetch/$s_!yp5_!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F09661276-95b3-4cda-938a-33511ed3b362_1378x806.png" width="1378" height="806" class="sizing-normal" alt="" loading="lazy"></a></figure></div><p>https://arxiv.org/pdf/2310.06117</p><h3>4.4 Analogical Prompting</h3><ul><li><p>Similar to Self-Generated In-Context Learning (SG-ICL)</p></li><li><p>Automatically generates exemplars that include Chains of Thought</p></li><li><p>Demonstrated improvements in mathematical reasoning and code generation tasks</p></li></ul><div 
class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!u43p!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F29ee4f21-7499-4f0d-9ae7-8caf951aa44c_1290x570.png"><img src="https://substackcdn.com/image/fetch/$s_!u43p!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F29ee4f21-7499-4f0d-9ae7-8caf951aa44c_1290x570.png" width="1290" height="570" class="sizing-normal" alt="" loading="lazy"></a></figure></div><p>https://arxiv.org/pdf/2310.01714</p><h3>4.5 Thread-of-Thought (ThoT) Prompting</h3><ul><li><p>Uses an improved thought inducer: "Walk me through this context in manageable parts step by step, summarizing and analyzing as we go."</p></li><li><p>Particularly effective in question-answering and retrieval settings with large, complex contexts</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!dKHB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff674fd88-bd44-47ad-b0dd-81ab69c32935_2496x786.png"><img src="https://substackcdn.com/image/fetch/$s_!dKHB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff674fd88-bd44-47ad-b0dd-81ab69c32935_2496x786.png" width="1456" height="459" class="sizing-normal" alt="" loading="lazy"></a></figure></div><h3>4.6 Tabular Chain-of-Thought (Tab-CoT)</h3><ul><li><p>Zero-Shot CoT prompt that makes the AI output reasoning as a markdown table</p></li><li><p>Tabular design improves the structure and reasoning of the output</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!8Nmb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e481763-d095-4b0e-bdc9-ee45211bd20a_1252x1270.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><img 
src="https://substackcdn.com/image/fetch/$s_!8Nmb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6e481763-d095-4b0e-bdc9-ee45211bd20a_1252x1270.png" width="1252" height="1270" class="sizing-normal" alt="" loading="lazy"></picture></div></a></figure></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!en2K!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d7b0c9f-e944-42a6-9c5e-8282bae0ec6c_2518x1072.png"><img src="https://substackcdn.com/image/fetch/$s_!en2K!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d7b0c9f-e944-42a6-9c5e-8282bae0ec6c_2518x1072.png" width="1456" height="620" class="sizing-normal" alt="" loading="lazy"></a></figure></div><p>https://arxiv.org/pdf/2305.17812</p><p>These Thought Generation techniques aim to improve the AI's problem-solving capabilities by making its reasoning process more explicit and structured. They can be particularly useful for complex tasks that require step-by-step reasoning or analysis.</p><h3>4.7 Few-Shot Chain-of-Thought (CoT) Techniques</h3><p>Few-Shot CoT techniques present the AI with multiple exemplars that include chains-of-thought. This approach can significantly enhance performance and is sometimes referred to as Manual-CoT or Golden CoT.</p><h4>4.7.1 Contrastive Chain-of-Thought</h4><ul><li><p>Adds incorrect reasoning exemplars alongside correct ones, showing the AI how not to reason</p></li><li><p>Has shown significant improvement in areas like Arithmetic Reasoning and Factual QA</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kFOr!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdf609914-f956-48de-842d-c1d84ccbbbb8_2592x1608.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kFOr!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdf609914-f956-48de-842d-c1d84ccbbbb8_2592x1608.png 424w, https://substackcdn.com/image/fetch/$s_!kFOr!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdf609914-f956-48de-842d-c1d84ccbbbb8_2592x1608.png 848w, https://substackcdn.com/image/fetch/$s_!kFOr!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdf609914-f956-48de-842d-c1d84ccbbbb8_2592x1608.png 1272w, 
https://substackcdn.com/image/fetch/$s_!kFOr!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdf609914-f956-48de-842d-c1d84ccbbbb8_2592x1608.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!kFOr!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdf609914-f956-48de-842d-c1d84ccbbbb8_2592x1608.png" width="1456" height="903" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/df609914-f956-48de-842d-c1d84ccbbbb8_2592x1608.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:903,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:710204,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!kFOr!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdf609914-f956-48de-842d-c1d84ccbbbb8_2592x1608.png 424w, https://substackcdn.com/image/fetch/$s_!kFOr!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdf609914-f956-48de-842d-c1d84ccbbbb8_2592x1608.png 848w, https://substackcdn.com/image/fetch/$s_!kFOr!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdf609914-f956-48de-842d-c1d84ccbbbb8_2592x1608.png 1272w, 
https://substackcdn.com/image/fetch/$s_!kFOr!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdf609914-f956-48de-842d-c1d84ccbbbb8_2592x1608.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>https://arxiv.org/pdf/2311.09277</p><h4>4.7.2 Uncertainty-Routed CoT Prompting</h4><ol><li><p>Samples multiple CoT reasoning paths</p></li><li><p>Demonstrates improvement on both GPT-4 and Gemini Ultra models</p></li></ol><p><em>Note: Didn&#8217;t read in full</em></p><p>https://storage.googleapis.com/deepmind-media/gemini/gemini_1_report.pdf</p><h4>4.7.3 
Complexity-based Prompting</h4><ul><li><p>Two major modifications to CoT:</p></li></ul><ol><li><p>Selects complex examples for annotation and inclusion in the prompt</p></li><li><p>During inference, samples multiple reasoning chains and uses a majority vote among chains exceeding a certain length threshold</p></li></ol><ul><li><p>Premise: longer reasoning indicates higher answer quality</p></li><li><p>Has shown improvements on <strong>math</strong> reasoning datasets</p></li></ul><p><em>Note: tested on GPT-3 / Codex</em></p><p>https://arxiv.org/pdf/2210.00720</p><h4>4.7.4 Active Prompting</h4><ol><li><p>Starts with some training questions/exemplars</p></li><li><p>Asks the AI to solve them</p></li><li><p>Calculates uncertainty (disagreement between sampled answers, in this case)</p></li><li><p>Asks human annotators to rewrite the exemplars with the highest uncertainty</p></li></ol><p><em>Note: another math-focused technique, but this could likely work well with reasoning questions too.</em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!qpPY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a9c3733-4114-4a5a-8a74-bb77f63dce9b_1978x942.png"><img src="https://substackcdn.com/image/fetch/$s_!qpPY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a9c3733-4114-4a5a-8a74-bb77f63dce9b_1978x942.png" width="1456" height="693" alt="" loading="lazy"></a></figure></div><p>https://arxiv.org/pdf/2302.12246</p><h4>4.7.5 Memory-of-Thought Prompting</h4><ul><li><p>Using unlabeled training exemplars to
build Few-Shot CoT prompts at test time</p></li><li><p>Before test time: performs inference on unlabeled training exemplars with CoT</p></li><li><p>At test time: retrieves instances similar to the test sample</p></li><li><p>Improvements for arithmetic, commonsense, and factual reasoning</p></li></ul><p><em>Note: I didn&#8217;t fully understand how the proposed architecture would work in practice</em></p><p>https://arxiv.org/pdf/2305.05181</p><h4>4.7.6 Automatic Chain-of-Thought (Auto-CoT) Prompting</h4><ol><li><p>Uses a Zero-Shot prompt to automatically generate chains of thought</p></li><li><p>Uses these generated chains to build a Few-Shot CoT prompt for a test sample</p></li></ol><p><em>Note: GPT-3</em></p><p>https://arxiv.org/pdf/2210.03493</p><h2>5. Decomposition Techniques</h2><p>Decomposition techniques focus on breaking down complex problems into simpler sub-questions, which can significantly improve AI problem-solving abilities. While some of these techniques are similar to thought-inducing methods like Chain-of-Thought (CoT), they explicitly focus on problem breakdown.</p><h3>5.1 Least-to-Most Prompting</h3><ol><li><p>Prompts the AI to break a problem into sub-problems without solving them</p></li><li><p>Solves the sub-problems sequentially, appending each response to the prompt</p></li><li><p>Arrives at a final result</p></li></ol><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!qRSj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1a452ed-a3d4-4994-b1d3-22cd72383ab0_1568x1014.png"><img src="https://substackcdn.com/image/fetch/$s_!qRSj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd1a452ed-a3d4-4994-b1d3-22cd72383ab0_1568x1014.png" width="1456" height="942" alt="" loading="lazy"></a></figure></div><p>https://arxiv.org/pdf/2205.10625</p><h3>5.2 Decomposed Prompting (DECOMP)</h3><ol><li><p>Uses Few-Shot prompting to teach the AI how to use certain functions (e.g., string splitting, internet searching)</p></li><li><p>The AI breaks down the original problem into sub-problems and sends them to different functions</p></li><li><p>Functions are often implemented as separate AI calls</p></li></ol><ul><li><p>Has shown improved performance over Least-to-Most prompting on some tasks</p></li></ul><p><em>Note: GPT-3</em></p><p>https://arxiv.org/pdf/2210.02406</p><h3>5.3 Plan-and-Solve Prompting</h3><ul><li><p>Uses an improved Zero-Shot CoT prompt: "Let's first understand the problem and devise a plan to solve it.
Then, let's carry out the plan and solve the problem step by step"</p></li><li><p>Generates more robust reasoning processes than standard Zero-Shot-CoT on multiple reasoning datasets</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!n8L4!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053bd47a-71e1-40f9-957b-813d5fdf143c_1978x934.png"><img src="https://substackcdn.com/image/fetch/$s_!n8L4!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F053bd47a-71e1-40f9-957b-813d5fdf143c_1978x934.png" width="1456" height="688" alt="" loading="lazy"></a></figure></div><p>https://arxiv.org/pdf/2305.04091</p><h3>5.4 Tree-of-Thought (ToT)</h3><ol><li><p>Creates a tree-like search problem starting with an initial problem</p></li><li><p>Generates multiple possible next steps in the form of thoughts (as from a CoT)</p></li><li><p>Evaluates the progress of each step towards solving the problem (through prompting)</p></li><li><p>Decides which steps to continue with and keeps creating more thoughts</p></li></ol><ul><li><p>Particularly effective for tasks that require search and planning</p></li></ul><p>https://github.com/princeton-nlp/tree-of-thought-llm</p><p>(relevant <a href="https://github.com/princeton-nlp/tree-of-thought-llm/blob/master/src/tot/prompts/text.py">prompts</a>)</p><h3>5.5 Recursion-of-Thought</h3><ol><li><p>Similar to regular CoT, but when it encounters a complicated problem in the reasoning chain, it sends that problem to a separate prompt/AI call</p></li><li><p>Inserts the answer from the sub-problem back into the original
prompt</p></li><li><p>Can recursively solve complex problems, including those that might otherwise exceed the maximum context length</p></li></ol><ul><li><p>Has shown improvements on arithmetic and algorithmic tasks</p></li></ul><h3>5.6 Faithful Chain-of-Thought</h3><ul><li><p>Generates a CoT that includes both natural language and symbolic language (e.g., Python) reasoning</p></li><li><p>Uses different types of symbolic languages in a task-dependent fashion</p></li></ul><p><em>Note: can&#8217;t think of a practical use for this, still cool nonetheless</em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!j8PQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b5c3b1f-9fa7-4b22-a216-0fd85e2a2861_1668x632.png"><img src="https://substackcdn.com/image/fetch/$s_!j8PQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0b5c3b1f-9fa7-4b22-a216-0fd85e2a2861_1668x632.png" width="1456" height="552" alt="" loading="lazy"></a></figure></div><h3>5.7 Skeleton-of-Thought</h3><ol><li><p>Prompts the AI to create a skeleton of the answer (sub-problems to be solved)</p></li><li><p>Sends these questions to an AI in parallel</p></li><li><p>Concatenates all the outputs to get a final response</p></li></ol><ul><li><p>Focuses on accelerating answer speed through parallelization</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!O185!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb8b742d0-6b19-45a5-8839-97eb5169ff4e_1510x1926.png"><img src="https://substackcdn.com/image/fetch/$s_!O185!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb8b742d0-6b19-45a5-8839-97eb5169ff4e_1510x1926.png" width="1456" height="1857" alt="" loading="lazy"></a></figure></div><p>https://arxiv.org/pdf/2307.15337</p><h2>6. Ensembling Techniques</h2><p>Ensembling involves using multiple prompts to solve the same problem and then aggregating the responses into a final output. These techniques often improve accuracy and reduce output variance, but increase cost.</p><h3>6.1 Demonstration Ensembling (DENSE)</h3><ol><li><p>Creates multiple few-shot prompts, each with a distinct subset of exemplars from the training set</p></li><li><p>Aggregates the outputs to generate a final response</p></li></ol><h3>6.2 Mixture of Reasoning Experts (MoRE)</h3><ol><li><p>Creates diverse reasoning experts using specialized prompts for different reasoning types</p></li><li><p>Selects the best answer based on an agreement score</p></li></ol><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!sb4t!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc4ebf697-6ef7-4202-b893-bcb9450298d1_966x646.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!sb4t!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc4ebf697-6ef7-4202-b893-bcb9450298d1_966x646.png 424w, https://substackcdn.com/image/fetch/$s_!sb4t!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc4ebf697-6ef7-4202-b893-bcb9450298d1_966x646.png 848w,
https://substackcdn.com/image/fetch/$s_!sb4t!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc4ebf697-6ef7-4202-b893-bcb9450298d1_966x646.png 1272w, https://substackcdn.com/image/fetch/$s_!sb4t!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc4ebf697-6ef7-4202-b893-bcb9450298d1_966x646.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!sb4t!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc4ebf697-6ef7-4202-b893-bcb9450298d1_966x646.png" width="966" height="646" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c4ebf697-6ef7-4202-b893-bcb9450298d1_966x646.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:646,&quot;width&quot;:966,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:152393,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!sb4t!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc4ebf697-6ef7-4202-b893-bcb9450298d1_966x646.png 424w, https://substackcdn.com/image/fetch/$s_!sb4t!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc4ebf697-6ef7-4202-b893-bcb9450298d1_966x646.png 848w, 
https://substackcdn.com/image/fetch/$s_!sb4t!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc4ebf697-6ef7-4202-b893-bcb9450298d1_966x646.png 1272w, https://substackcdn.com/image/fetch/$s_!sb4t!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc4ebf697-6ef7-4202-b893-bcb9450298d1_966x646.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>https://arxiv.org/pdf/2305.14628</p><h3>6.3 Max Mutual Information Method</h3><ol><li><p>Creates multiple prompt templates with varied 
styles and exemplars</p></li><li><p>Selects the optimal template that maximizes mutual information between the prompt and the AI's outputs</p></li></ol><h3>6.4 Self-Consistency</h3><ol><li><p>Samples multiple reasoning paths for a given input</p></li><li><p>Selects the most consistent output as the final response</p></li></ol><ul><li><p>Has shown improvements on arithmetic, commonsense, and symbolic reasoning tasks</p></li></ul><h3>6.5 Universal Self-Consistency</h3><ul><li><p>Similar to Self-Consistency, but uses a prompt template to select the majority answer</p></li><li><p>Helpful for free-form text generation and cases where the same answer may be output differently by different prompts</p></li></ul><p>https://arxiv.org/pdf/2311.17311</p><h3>6.6 Meta-Reasoning over Multiple CoTs</h3><ol><li><p>Generates multiple reasoning chains for a given problem</p></li><li><p>Inserts all chains into a single prompt template</p></li><li><p>Generates a final answer from the combined chains</p></li></ol><p>https://arxiv.org/pdf/2304.13007</p><h3>6.7 DiVeRSe</h3><ol><li><p>Creates multiple prompts for a given problem</p></li><li><p>Performs Self-Consistency for each, generating multiple reasoning paths</p></li><li><p>Scores reasoning paths based on each step</p></li><li><p>Selects a final response based on scores</p></li></ol><h3>6.8 Consistency-based Self-adaptive Prompting (COSP)</h3><ol><li><p>Runs Zero-Shot CoT with Self-Consistency on a set of examples</p></li><li><p>Selects a high agreement subset of outputs to be included in the final prompt as exemplars</p></li><li><p>Performs Self-Consistency again with this final prompt</p></li></ol><h3>6.9 Universal Self-Adaptive Prompting (USP)</h3><ul><li><p>Builds upon COSP to make it generalizable to all tasks</p></li><li><p>Uses unlabeled data to generate exemplars</p></li><li><p>Employs a more complicated scoring function to select exemplars</p></li><li><p>Does not use Self-Consistency</p></li></ul><h3>6.10 Prompt 
Paraphrasing</h3><ul><li><p>Transforms an original prompt by changing some wording while maintaining overall meaning</p></li><li><p>Acts as a data augmentation technique for generating prompts for an ensemble</p></li></ul><p><em>Note: 2020, still makes sense though</em></p><h2>7. Self-Criticism Techniques</h2><p>Self-Criticism techniques involve having AI models evaluate and improve their own outputs. These methods can range from simple judgments to providing detailed feedback for improvement.</p><h3>7.1 Self-Calibration</h3><ol><li><p>Prompts the AI to answer a question</p></li><li><p>Builds a new prompt including the question, the AI's answer, and an instruction asking whether the answer is correct</p></li></ol><ul><li><p>Useful for gauging confidence levels and deciding when to accept or revise the original answer</p></li></ul><p>https://arxiv.org/pdf/2207.05221</p><h3>7.2 Self-Refine</h3><ol><li><p>Gets an initial answer from the AI</p></li><li><p>Prompts the AI to provide feedback on its own answer</p></li><li><p>Prompts the AI to improve the answer based on the feedback</p></li><li><p>Repeats the process until a stopping condition is met</p></li></ol><ul><li><p>Has demonstrated improvement across reasoning, coding, and generation tasks</p></li></ul><p>https://arxiv.org/pdf/2303.17651</p><h3>7.3 Reversing Chain-of-Thought (RCoT)</h3><ol><li><p>Prompts the AI to reconstruct the problem based on its generated answer</p></li><li><p>Generates fine-grained comparisons between the original and reconstructed problems to check for inconsistencies</p></li><li><p>Converts inconsistencies to feedback for the AI to revise the generated answer</p></li></ol><h3>7.4 Self-Verification</h3><ol><li><p>Generates multiple candidate solutions with Chain-of-Thought (CoT)</p></li><li><p>Scores each solution by masking parts of the original question and asking the AI to predict them based on the rest of the question and the generated solution</p></li></ol><ul><li><p>Has shown 
improvement on eight reasoning datasets</p></li></ul><h3>7.5 Chain-of-Verification (COVE)</h3><ol><li><p>Uses the AI to generate an answer to a given question</p></li><li><p>Creates a list of related questions to help verify the correctness of the answer</p></li><li><p>Each verification question is answered by the AI</p></li><li><p>All information is given to the AI to produce the final revised answer</p></li></ol><ul><li><p>Improvements in various question-answering and text-generation tasks</p></li></ul><h3>7.6 Cumulative Reasoning</h3><ol><li><p>Generates several potential steps in answering the question</p></li><li><p>Has the AI evaluate these steps, deciding to accept or reject each</p></li><li><p>Checks whether it has arrived at the final answer</p></li><li><p>If not, repeats the process</p></li></ol><ul><li><p>Better output in logical inference tasks and mathematical problems</p></li></ul><h2>8. Evaluation Techniques</h2><p>LLMs can be used as evaluators, extracting and reasoning about information to assess the quality of texts or other LLM outputs. 
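</p><p><em>A minimal sketch of the core loop (the <code>call_llm</code> stub below is hypothetical; swap in your provider's client):</em></p>

```python
import re

def call_llm(prompt: str) -> str:
    # Hypothetical stub standing in for a real LLM API call.
    return "Score: 4"

def judge(text: str) -> int:
    """LLM-as-evaluator: request a 1-5 linear-scale score, then parse it."""
    prompt = (
        "Score the following story on a scale of 1-5 "
        f"from well to poorly written: {text}"
    )
    reply = call_llm(prompt)
    match = re.search(r"\b[1-5]\b", reply)
    if match is None:
        raise ValueError(f"unparseable judgment: {reply!r}")
    return int(match.group())

print(judge("Once upon a time..."))
```

<p>Parsing with a strict pattern, and failing loudly otherwise, keeps malformed judgments from silently polluting scores.</p><p>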
This section covers key components of evaluation frameworks using LLMs.</p><h3>8.1 Prompting Techniques for Evaluation</h3><p>Evaluation prompts often benefit from standard text-based prompting techniques, including:</p><ul><li><p><strong>In-Context Learning</strong>: Provides examples of evaluations to guide the model.</p></li><li><p><strong>Role-based Evaluation</strong>: Assigns specific roles to the LLM to generate diverse evaluations or create a multi-agent debate setting.</p></li><li><p><strong>Chain-of-Thought</strong>: Improves evaluation performance by encouraging step-by-step reasoning.</p></li><li><p><strong>Model-Generated Guidelines</strong>: Prompts the LLM to generate its own evaluation guidelines, reducing inconsistencies from ill-defined scoring criteria.</p></li></ul><h3>8.2 Output Formats</h3><p>The format of the LLM's evaluation output can significantly affect performance:</p><ul><li><p><strong>Styling</strong>: Using XML or JSON formatting can improve judgment accuracy.</p></li><li><p><strong>Linear Scale</strong>: Simple numerical scales (e.g., 1-5, 1-10, 0-1) for scoring.</p></li></ul><p>Example:</p><pre><code><code>Score the following story on a scale of 1-5 from well to poorly written: {INPUT}</code></code></pre><ul><li><p><strong>Binary Score</strong>: Yes/No or True/False responses for simple judgments.</p></li></ul><p>Example:</p><pre><code><code>Is the following story well written at a high-school level (yes/no)?: {INPUT}</code></code></pre><ul><li><p><strong>Likert Scale</strong>: Provides a more nuanced understanding of the scale.</p></li></ul><p>Example:</p><pre><code><code>Score the following story according to the following scale:

Poor Acceptable Good Very Good Incredible

{INPUT}</code></code></pre><h3>8.3 Prompting Frameworks for Evaluation</h3><p>These frameworks provide structured approaches to using LLMs for evaluation tasks:</p><h4>8.3.1 LLM-EVAL</h4><ul><li><p>One of the simplest evaluation frameworks</p></li><li><p>Uses a single prompt containing:</p></li></ul><ol><li><p>A schema of variables to evaluate (e.g., grammar, relevance)</p></li><li><p>An instruction telling the model to output scores for each variable within a certain range</p></li><li><p>The content to evaluate</p></li></ol><p><em>Notes: this works so well</em></p><p>https://arxiv.org/pdf/2305.13711</p><h4>8.3.2 G-EVAL</h4><ul><li><p>Similar to LLM-EVAL, but includes AutoCoT (Automatic Chain-of-Thought) steps in the prompt itself</p></li><li><p>Process:</p></li></ul><ol><li><p>Generates evaluation steps according to the instructions</p></li><li><p>Inserts these steps into the final prompt</p></li><li><p>Weights answers according to token probabilities</p></li></ol><pre><code><code>Evaluate Coherence in the Summarization Task

You will be given one summary written for a news article.

Your task is to rate the summary on one metric.

Please make sure you read and understand these instructions carefully. Please keep this document open while reviewing, and refer to it as needed.

Evaluation Criteria:

Coherence (1-5) - the collective quality of all sentences. We align this dimension with the DUC quality question of structure and coherence whereby &#8220;the summary should be well-structured and well-organized. The summary should not just be a heap of related information, but should build from sentence to sentence to a coherent body of information about a topic.&#8221;

Evaluation Steps:

1. Read the news article carefully and identify the main topic and key points.

2. Read the summary and compare it to the news article. Check if the summary covers the main topic and key points of the news article, and if it presents them in a clear and logical order.

3. Assign a score for coherence on a scale of 1 to 5, where 1 is the lowest and 5 is the highest based on the Evaluation Criteria.

Example: Source Text: 

{{Document}} 

Summary: 

{{Summary}}

Evaluation Form (scores ONLY): 

- Coherence:</code></code></pre><p>https://arxiv.org/pdf/2303.16634</p><h4>8.3.3 ChatEval</h4><ul><li><p>Uses a multi-agent debate framework</p></li><li><p>Each agent has a separate role in the evaluation process</p></li><li><p>Allows for diverse perspectives and more nuanced evaluations</p></li></ul><p>These frameworks demonstrate different approaches to structuring LLM-based evaluations, from simple single-prompt methods to more complex multi-agent systems. The choice of framework can depend on the complexity of the evaluation task and the desired level of detail and diversity in the assessment.</p><h3>8.4 Other Evaluation Methodologies</h3><p>In addition to explicit prompting for quality assessments, there are several other methodologies for LLM-based evaluation:</p><h4>8.4.1 Implicit Scoring</h4><blockquote><p>While most approaches directly prompt the LLM to generate a quality assessment&nbsp;<strong>(explicit)</strong>, some works also use implicit scoring where a quality score is derived using the model&#8217;s confidence in its prediction (Chen et al., 2023g) or the likelihood of generating the output (Fu et al., 2023a) or via the models&#8217; explanation (e.g. count the number of errors as in Fernandes et al. (2023); Kocmi and Federmann (2023a)) or via evaluation on proxy tasks (factual inconsistency via entailment as in Luo et al. (2023))</p></blockquote><p><em>TO-DO: Implicit has worked for most of my needs, I need to check these individual papers later</em></p><h4>8.4.2 Batch Prompting</h4><ul><li><p>Evaluates multiple instances at once, or the same instance under different criteria/roles. Improves compute and cost efficiency</p></li></ul><h4>8.4.3 Pairwise vs Pointwise Evaluation</h4><p>Directly comparing the quality of two texts (<em>Pairwise</em> evaluation) may lead to suboptimal results; explicitly asking the LLM to generate a score for individual summaries is the most effective and reliable method (<strong>Pointwise</strong>). 
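</p><p><em>Pointwise ranking, sketched (the <code>judge_score</code> stub is hypothetical; in practice it would prompt an LLM for one score per text):</em></p>

```python
def judge_score(text: str) -> float:
    # Hypothetical stub: in practice, prompt the LLM for a score per text.
    canned = {"summary A": 2.0, "summary B": 4.5, "summary C": 3.0}
    return canned[text]

def rank_pointwise(candidates: list[str]) -> list[str]:
    """Score each candidate independently, then sort by score (best first)."""
    return sorted(candidates, key=judge_score, reverse=True)

print(rank_pointwise(["summary A", "summary B", "summary C"]))
```

<p>Because each score is obtained independently, shuffling the candidates cannot change the ranking.</p><p>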
The order of the inputs for pairwise comparisons can also heavily affect evaluation.</p><p><em>Note: relevant for summarization</em></p><div class="captioned-image-container"><figure><a class="image-link" target="_blank" href="https://substackcdn.com/image/fetch/$s_!5386!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55412dbc-2a98-4ce8-90f0-1e952b262e72_1552x798.png"><img src="https://substackcdn.com/image/fetch/$s_!5386!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F55412dbc-2a98-4ce8-90f0-1e952b262e72_1552x798.png" width="1456" height="749" alt=""></a></figure></div><p>https://arxiv.org/html/2406.12319v1</p><h2>9.
Prompting Issues: Security Concerns</h2><p>https://arxiv.org/pdf/2311.16119</p><h3>9.1 Types of Prompt Hacking</h3><h4>9.1.1 Prompt Injection</h4><ul><li><p>Overrides original developer instructions in the prompt with user input</p></li><li><p>Exploits the LLM's inability to distinguish between developer instructions and user input</p></li></ul><h4>9.1.2 Jailbreaking</h4><ul><li><p>Gets an LLM to perform unintended actions through prompting</p></li><li><p>Can be an architectural or training problem</p></li></ul><h3>9.2 Risks of Prompt Hacking</h3><h4>9.2.1 Data Privacy</h4><ul><li><p>Training Data Reconstruction: Extracting training data from LLMs</p></li><li><p>Prompt Leaking: Extracting the prompt template from an application</p></li></ul><h3>9.3 Hardening Measures</h3><p>Some mitigation strategies:</p><h4>9.3.1 Prompt-based Defenses</h4><ul><li><p>Include instructions in the prompt to avoid prompt injection</p></li></ul><h4>9.3.2 Guardrails</h4><ul><li><p>Rules and frameworks for guiding LLM outputs</p></li><li><p>Range from simple classification of user input to complex dialogue managers</p></li></ul><h4>9.3.3 Detectors</h4><ul><li><p>Tools designed to detect malicious inputs</p></li><li><p>Often built using fine-tuned models trained on malicious prompts</p></li><li><p>Generally more effective than prompt-based defenses</p></li></ul><h2>10. Prompting Issues: Alignment Concerns</h2><h3>10.1 Prompt Sensitivity</h3><p>LLMs are highly sensitive to input prompts, with even subtle changes resulting in vastly different outputs.</p><h4>10.1.1 Task Format</h4><ul><li><p>Different ways of phrasing the same task can significantly alter LLM performance.</p></li><li><p>Example: In sentiment analysis, asking to classify a review as "positive" or "negative" versus asking "Is this review positive?" (yes/no) can alter GPT-3's accuracy by up to 30% (Zhao et al., 2021b). 
<strong>(NOTE: I don&#8217;t think this is relevant with GPT-4s anymore)</strong></p></li><li><p>Even logically equivalent prompts, such as altering the order of choices in multiple-choice questions, can lead to significant performance degradation. <strong>(NOTE: same, don&#8217;t think it&#8217;s relevant anymore)</strong></p></li></ul><h4>10.1.2 Prompt Drift</h4><ul><li><p>Occurs when the model behind an API changes over time, causing the same prompt to produce different results on the updated model.</p></li></ul><h3>10.2 Overconfidence and Calibration</h3><p>LLMs often express overconfidence in their answers, which can lead to user overreliance on model outputs.</p><h4>10.2.1 Verbalized Score</h4><ul><li><p>A simple calibration technique that generates a confidence score (e.g., "How confident are you from 1 to 10").</p></li><li><p>Efficacy is debated:</p></li><li><p>One paper finds that LLMs are highly overconfident when verbalizing confidence scores, even when using self-consistency and chain-of-thought.</p></li><li><p>In contrast, another finds that simple prompts can achieve more accurate calibration than the model's output token probabilities.</p></li></ul><p><em>Note: I have pretty good results with confidence scores on my use cases so far</em></p><h4>10.2.2 Sycophancy</h4><ul><li><p>Refers to LLMs' tendency to agree with the user, even when contradicting their own initial output.</p></li><li><p>LLMs are easily swayed if the user's opinion is included in the prompt (e.g., "I really like/dislike this argument").</p></li><li><p>Questioning the LLM's original answer, strongly asserting an assessment of correctness, or adding false assumptions can completely change the model output.</p></li><li><p>Sycophancy is heightened for larger and instruction-tuned models.</p></li><li><p>Solution: Avoid including personal opinions in prompts to prevent undue influence.</p></li></ul><h3>10.3 Biases, Stereotypes, and Culture</h3><p>Efforts to ensure fairness and cultural
sensitivity in LLM outputs include:</p><h4>10.3.1 Vanilla Prompting</h4><ul><li><p>Simply instructing the LLM to be unbiased in the prompt.</p></li><li><p>Also referred to as moral self-correction.</p></li></ul><h4>10.3.2 Selecting Balanced Demonstrations</h4><ul><li><p>Using demonstrations optimized for fairness metrics.</p></li><li><p>Can help reduce biases in model outputs.</p></li></ul><h4>10.3.3 Cultural Awareness</h4><ul><li><p>Injecting cultural context into prompts to help LLMs with cultural adaptation.</p></li><li><p>Techniques include:</p></li></ul><ol><li><p>Asking the LLM to refine its own output.</p></li><li><p>Instructing the LLM to use culturally relevant words.</p></li></ol><h4>10.3.4 AttrPrompt</h4><ul><li><p>Designed to avoid biases in synthetic data generation.</p></li><li><p>Two-step process:</p></li></ul><ol><li><p>Asks the LLM to generate specific attributes important for diversity (e.g., location).</p></li><li><p>Prompts the LLM to generate synthetic data by varying each of these attributes.</p></li></ol><h3>10.4 Ambiguity</h3><p>Addressing challenges posed by ambiguous questions that can be interpreted in multiple ways:</p><h4>10.4.1 Ambiguous Demonstrations</h4><ul><li><p>Including examples with ambiguous label sets in prompts.</p></li><li><p>Can be automated with a retriever or done manually.</p></li><li><p>Shown to increase In-Context Learning performance.</p></li></ul><div class="captioned-image-container"><figure><a class="image-link" target="_blank" href="https://substackcdn.com/image/fetch/$s_!0LdX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61e30557-4795-41b0-b3cc-c228dfe8b27f_2064x1384.png"><img src="https://substackcdn.com/image/fetch/$s_!0LdX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F61e30557-4795-41b0-b3cc-c228dfe8b27f_2064x1384.png" width="1456" height="976" alt=""></a></figure></div><p>https://arxiv.org/pdf/2309.07900</p><h4>10.4.2 Question Clarification</h4><ul><li><p>Allows the LLM to identify ambiguous questions and generate clarifying questions.</p></li><li><p>Multi-step process:</p></li></ul><ol><li><p>Generate an initial answer.</p></li><li><p>Classify whether to generate clarification questions or return the initial answer.</p></li><li><p>Decide what clarification questions to generate.</p></li><li><p>Generate a final answer based on clarifications.</p></li></ol><p>These alignment issues highlight the complexity of ensuring LLMs behave as intended across various contexts and user needs. Addressing these concerns is crucial for the responsible and effective deployment of LLM-based systems.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!fLrW!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1ebfb750-a05a-41ac-88b7-e1cc15c6548c_1852x986.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!fLrW!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1ebfb750-a05a-41ac-88b7-e1cc15c6548c_1852x986.png 424w, https://substackcdn.com/image/fetch/$s_!fLrW!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1ebfb750-a05a-41ac-88b7-e1cc15c6548c_1852x986.png 848w,
https://substackcdn.com/image/fetch/$s_!fLrW!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1ebfb750-a05a-41ac-88b7-e1cc15c6548c_1852x986.png 1272w, https://substackcdn.com/image/fetch/$s_!fLrW!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1ebfb750-a05a-41ac-88b7-e1cc15c6548c_1852x986.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!fLrW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1ebfb750-a05a-41ac-88b7-e1cc15c6548c_1852x986.png" width="1456" height="775" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1ebfb750-a05a-41ac-88b7-e1cc15c6548c_1852x986.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:775,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:350361,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!fLrW!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1ebfb750-a05a-41ac-88b7-e1cc15c6548c_1852x986.png 424w, https://substackcdn.com/image/fetch/$s_!fLrW!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1ebfb750-a05a-41ac-88b7-e1cc15c6548c_1852x986.png 848w, 
https://substackcdn.com/image/fetch/$s_!fLrW!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1ebfb750-a05a-41ac-88b7-e1cc15c6548c_1852x986.png 1272w, https://substackcdn.com/image/fetch/$s_!fLrW!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1ebfb750-a05a-41ac-88b7-e1cc15c6548c_1852x986.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>https://arxiv.org/pdf/2311.09469</p>]]></content:encoded></item><item><title><![CDATA[One Month Since Leaving 
Canopy]]></title><description><![CDATA[Coffee Chats, Meetups, Tailwind, React, SAML, Eval, Extensive Prompt Engineering, and Replay Journal]]></description><link>https://journal.daniellopes.dev/p/one-month-since-leaving-canopy</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/one-month-since-leaving-canopy</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Sat, 08 Jun 2024 04:30:37 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!3GxJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>It's been <a href="https://journal.daniellopes.dev/p/end-of-a-7-year-journey">exactly one month since</a> I left my full-time job at Canopy to focus on studying (mostly LLMs), building things for myself, and figuring out what's next. This first month flew by!</p><h2>Ideation</h2><p>I spent a lot of the first week on ideation, picking interesting things I can build alone while learning product engineering and practical LLM applications. I'm currently reading <a href="https://www.amazon.com/Disciplined-Entrepreneurship-Steps-Successful-Startup/dp/1118692284">"Disciplined Entrepreneurship: 24 Steps to a Successful Startup,"</a> recommended by my friend <a href="https://danielraffel.me">Daniel Raffel</a>. While not directly applicable to small solo/indie projects that I&#8217;m focused on now, it's a great resource for potential future things.</p><h2>Introducing <a href="https://replayjournal.ai">Replay</a></h2><p>Most of my coding time this month was spent finishing Replay, a conversational journaling app I built for myself over the last few months. Journaling has been incredibly helpful for me, but I grew frustrated with existing apps. I believe LLMs can serve as a sounding board and provide context-aware probing. 
I'll write more about Replay in a separate post when I open it for signups. In the meantime, you can join the waitlist and see some features at <a href="https://replayjournal.ai">replayjournal.ai</a>.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!3GxJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!3GxJ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png 424w, https://substackcdn.com/image/fetch/$s_!3GxJ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png 848w, https://substackcdn.com/image/fetch/$s_!3GxJ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png 1272w, https://substackcdn.com/image/fetch/$s_!3GxJ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!3GxJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png" width="1456" height="836" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:836,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:909169,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!3GxJ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png 424w, https://substackcdn.com/image/fetch/$s_!3GxJ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png 848w, https://substackcdn.com/image/fetch/$s_!3GxJ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png 1272w, https://substackcdn.com/image/fetch/$s_!3GxJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F81f6ec43-f327-46ef-8aaa-2c71bb36186c_4064x2334.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h2>Prompt Engineering</h2><p>In Replay, I used OpenAI assistants for everything, from categorization and emotion analysis to mood detection, topic extraction, and summarization. I also have assistants for each of the 20+ journaling types and one for SEO. I spent a significant amount of time this month doing prompt engineering. </p><p>I did a lot of work on iterating on different types of summarizers for a future project.</p><p>Here are four helpful resources:</p><p>1. <a href="https://docs.anthropic.com/en/docs/prompt-engineering">https://docs.anthropic.com/en/docs/prompt-engineering</a></p><p>2. <a href="https://platform.openai.com/docs/guides/prompt-engineering/strategy-give-models-time-to-think">https://platform.openai.com/docs/guides/prompt-engineering/strategy-give-models-time-to-think</a></p><p>3. <a href="https://www.promptingguide.ai">https://www.promptingguide.ai</a></p><p>4. 
<a href="https://github.com/danielmiessler/fabric/tree/main/patterns">https://github.com/danielmiessler/fabric/tree/main/patterns</a> (not a guide, but you can learn a lot from the source)</p><h2>Rails Boilerplate + Tailwind</h2><p>I spent a lot of time figuring out a good boilerplate for my next projects and learning Tailwind. The combination of Rails, <a href="https://jumpstartrails.com">Jumpstart Pro</a>, and Tailwind (especially <a href="https://tailwindui.com">TailwindUI</a>) has been a good choice. Tailwind has a weird learning curve and can be a pain for advanced things, but overall, I'm happy with it. Having everything in one file allows me to lean on a <a href="https://chatgpt.com/g/g-79HB6JkLa-react-next-tailwind">custom GPTs</a> in ChatGPT and <a href="https://v0.dev">V0</a> for quick initial versions.</p><p>It's easy to underestimate the time spent on non-core tasks, but they always add up to weeks or look unprofessional if neglected (e.g., authentication, CDNs, servers, meta descriptions, landing pages, uploads, Stripe, etc.).</p><h2>SAML + React</h2><p>Last couple of weeks, I helped my friends at <a href="https://getkoala.com">Koala</a> with their SAML integration. I like their team, product and space, and I'm close with one of the founders. They're solving issues I had when helping sales at Canopy. And they have a React + Rails stack, that I wanted to see how they managed their frontend choices. They did it very pragmatically, handling the frontend quite well while benefiting from Rails on the backend (no GraphQL, model duplication or API calls). I'll write a separate post on their setup another day.</p><h2>Canopy AI: Monitoring + Eval</h2><p>This month, I also spent time upgrading and improving the AI leadership coach inside <a href="https://canopy.is">Canopy</a>. It's been cool to monitor how people are using it, where it works well, and where it fails. 
I've started studying more about eval and assessing tools like LangSmith, Ragas, Guardrails, and <a href="https://github.com/openai/evals">OpenAI's framework</a>. The problem is that all of these require a Python (or JS) stack. So, next month, I'll likely roll out <a href="https://athina.ai">Athina.ai</a> for Canopy to avoid introducing Python to the tech stack.</p><h2>Meetups and Networking</h2><p>One of the best things about being in San Francisco is the number of people to meet and events to attend. I was negligent in staying connected with colleagues and friends while overworking at Canopy, but this month I started fixing that.</p><p>I also attended several meetups and virtual events, learning about AI agents, LLM-based crawling, accuracy, efficiency, RAGs, knowledge graphs, pre- and post-processing, speed, and various architectures. My favorite meetups are hosted by the <a href="https://mlops.community">MLOps community</a>, and I'm excited to attend their upcoming conference: <a href="https://www.aiqualityconference.com">https://www.aiqualityconference.com</a>.</p><p>This month I also started asking for help from my girlfriend (she's an ML Engineer/Data Scientist). So she's sharing more notes from conferences, papers, etc. I find this paper in particular fascinating: https://arxiv.org/pdf/2308.10053.</p><h2><strong>Plans for Next Month: Eval + Agents </strong></h2><p>In the coming month, I plan to:</p><p>1. Open Replay for all users within the next week or two.</p><p>2. Study eval while measuring results using production data from Canopy and Replay, as well as during the prototyping phase.</p><p>3. Spend time studying both LLM theory and traditional ML.</p><p>4. Start working on a new agent-related product by the end of the month.</p><p>&#8212;</p><p>It's been an exciting and productive first month. 
Excited for the second!</p>]]></content:encoded></item><item><title><![CDATA[W3 Recap: 4o, Marketing & Product work]]></title><description><![CDATA[3 days off but still managed to make some progress&#8230;]]></description><link>https://journal.daniellopes.dev/p/w3-recap-4o-marketing-and-product</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/w3-recap-4o-marketing-and-product</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Tue, 28 May 2024 03:45:17 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!DDfb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F34d438af-6002-44ef-827b-5561049ff932_1456x832.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!DDfb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F34d438af-6002-44ef-827b-5561049ff932_1456x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!DDfb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F34d438af-6002-44ef-827b-5561049ff932_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!DDfb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F34d438af-6002-44ef-827b-5561049ff932_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!DDfb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F34d438af-6002-44ef-827b-5561049ff932_1456x832.png 1272w, 
https://substackcdn.com/image/fetch/$s_!DDfb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F34d438af-6002-44ef-827b-5561049ff932_1456x832.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!DDfb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F34d438af-6002-44ef-827b-5561049ff932_1456x832.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/34d438af-6002-44ef-827b-5561049ff932_1456x832.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2205357,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:&quot;&quot;,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!DDfb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F34d438af-6002-44ef-827b-5561049ff932_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!DDfb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F34d438af-6002-44ef-827b-5561049ff932_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!DDfb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F34d438af-6002-44ef-827b-5561049ff932_1456x832.png 1272w, 
https://substackcdn.com/image/fetch/$s_!DDfb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F34d438af-6002-44ef-827b-5561049ff932_1456x832.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Context: In May I left my <a href="https://journal.daniellopes.dev/p/end-of-a-7-year-journey">full-time job</a> to start a 3-month LLM learning cycle by building a few products of my own. I&#8217;m documenting my journey here. Every week I post a recap of how the week went, things I learned, etc.</p><p>Had a bit of a shorter week. 
I went on a camping trip to Yosemite and had no service for 3 days. Despite this, I still managed to pack in three 10-hour days.</p><p>Here's some of the progress and learnings:</p><h2>Canopy AI: JSON vs XML and streaming</h2><p>I'm still helping maintain the AI Assistant I built for <a href="https://canopy.is/">Canopy</a>. It has the usual chat experience: a user asks a question, we do a RAG search, and we layer a bunch of things on top. </p><p>One frustrating aspect has been GPT-4's unreliability in returning JSON. OpenAI's Completion API has a JSON mode that ensures valid JSON output, but this option isn't available on the Assistant API when using the File Retrieval tool (their built-in RAG).</p><p>Although OpenAI's built-in RAG gives me better results than Pinecone and pgvector, the trade-off for Canopy's use case was losing the JSON mode.</p><p>Canopy's assistant does more than just RAG the content. It runs categorization, sensitivity checks, confidence checks, and other tasks that might trigger subsequent LLM calls or be needed in the user interface. These constraints and extra information are metadata that I used to include in a single JSON response (one API call, for speed and cost).</p><p>The problem with JSON is streaming. Ideally, you want to start rendering the message as you receive the response. In the JSON, I had a markdown answer as one of the values. I'd buffer the JSON until the right value starts streaming and relay that to the user interface, stopping when the value ends. A better solution would be to have the markdown first, followed by the metadata in a structured data format (JSON, XML, etc.). After a full day of prompt engineering, I couldn't consistently get GPT-4 to return the metadata in this format after the markdown message &#8211; until GPT-4o.</p><p>Last week, I spent a day migrating from the JSON format to a structure with markdown first, followed by the metadata inside XML. 
Anthropic uses XML in their examples, so I've been using this tip with OpenAI as well (makes sense if you think about it, it&#8217;s easier for transformers to handle words than curly braces).</p><p>The end format looks like this:</p><pre><code>Markdown answer&#8230;
```xml
&lt;metadata&gt;
&#8230;
&lt;/metadata&gt;
```</code></pre><p>Incorporating the ``` is another trick that took me a while to learn. When you can't use JSON mode, OpenAI will often add wrappers like these. Instead of trying to remove them in the prompt or handling failures in the app, including "shots" (examples) with the ``` as part of the answer will guide the model to always return it.</p><p>This simplified the code on our app side and made the metadata correct nearly 100% of the time. The next step is to auto-retry on the rare cases when it fails, which I haven't implemented yet.</p><h2>The New Product</h2><p>So far, in these 3 weeks, I've spent around 40 hours on the new product, with a significant portion of that time dedicated to boilerplate tasks that I won't need to repeat for future products, which is a positive.</p><p>While most of the AI work was already done, I couldn't resist adding some extra features at the last minute with GPT-4o. Scope creep? Why not!</p><p>I finally have a name &#8211;&nbsp;but I&#8217;ll share it once the landing page is up. Until now I had it under a code name, but now I have the final name, domain, and a marketing landing page 90% done.</p><p>Selecting the domain and name included initial SEO work. I think it&#8217;s BS that domains are not relevant for SEO anymore. 
If you are a nobody in page authority, I think it&#8217;s a good idea to make the domains very literal, so that&#8217;s how I&#8217;m starting: a short name followed by something literal.</p><p>In the 2 days I had after spending a day on Canopy, I worked on:</p><ul><li><p><strong>Feature work:</strong></p><ul><li><p>Refining the AI features and adding some extra parallel LLM calls.</p></li><li><p>Finished an Insights page (I hate styling Heatmaps).</p></li><li><p>Added dark mode support while stuck in traffic driving to Yosemite.</p></li><li><p>Started with a Telegram bot.</p></li></ul></li><li><p><strong>Marketing</strong></p><ul><li><p>Finished a landing page and registered the domain.</p></li><li><p>SEO: Research to find keywords I could try to rank for and figure out a strategy for evergreen content (I'll need more time here).</p></li></ul></li></ul><p>Before sharing the product with everyone, I still have to polish some final details, like setting up the Stripe account, deploying to the final servers, etc.</p><p><strong>Timeline-wise, I'm thinking I need:</strong></p><ul><li><p>1 more full week for polishing, details, and putting up the waitlist (including boring things like Terms of Service).</p></li><li><p>About 3 days for the initial marketing work, including backlink work, evergreen SEO-focused pages, and some community-based distribution.</p></li></ul><p>Then, I'm switching attention to the second product that I want for myself (while keeping 25% of my time on this one).</p><p>Once the product is live with users, I plan to share some of the lessons more openly by writing specific posts about certain things. </p><p>For example, on this project, I've been using a mix of multiple parallel LLM calls and OpenAI batch processing. These two things could be interesting as standalone posts. 
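Since parallel LLM calls come up above, here is a minimal sketch of the fan-out pattern using plain Ruby threads. The `call_llm` helper and the task names are hypothetical stand-ins, not the project's actual code; a real version would make an API request inside each thread.

```ruby
# Hypothetical stand-in for a real API client call.
def call_llm(task, text)
  "#{task} result for: #{text[0, 20]}"
end

# Fan out one thread per independent task, then join with Thread#value
# and collect the results into a hash keyed by task name.
def analyze_entry(text)
  tasks = %w[categorize emotions topics summary]
  tasks.map { |task| Thread.new { [task, call_llm(task, text)] } }
       .map(&:value)
       .to_h
end

result = analyze_entry("Went hiking today and felt great.")
```

Because the calls are independent, wall-clock latency is roughly that of the slowest single call rather than the sum, which is the point of fanning out instead of calling sequentially.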
</p><p>I also hope to carve out time to write separate posts about some of the product and marketing aspects whenever I can.</p><h2>Week 4 Plan</h2><p>This week, I'll be splitting my work into three parts:</p><p>1 day for Canopy</p><p>2 days on New Product</p><p>3 days helping a friend with his startup (I usually work a 6-day week anyway)</p><p>I'm avoiding doing any kind of external work, but in this case, I made an exception. </p><p>It's a fixed-scope project on something I'm quite familiar with, plus it's an opportunity to learn some new React on a real product designed by great programmers. On top of that, what this friend is building is an impressive product in an interesting domain with a capable team, so I'll help whenever I can. I&#8217;m estimating probably 14 days on this (let&#8217;s see how off I&#8217;ll be this time).</p><p>This will delay my LLM learning cycle plan a bit, but it's a good trade-off.</p><p>I'll continue to share my updates here regardless.</p>]]></content:encoded></item><item><title><![CDATA[RANT: You're Using ChatGPT Wrong]]></title><description><![CDATA[This isn't polished tech where you can put in no effort and get results.]]></description><link>https://journal.daniellopes.dev/p/rant-youre-using-chatgpt-wrong</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/rant-youre-using-chatgpt-wrong</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Wed, 22 May 2024 20:34:33 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!gFCO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!gFCO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!gFCO!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!gFCO!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!gFCO!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!gFCO!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!gFCO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png" width="1456" height="832" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1924794,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!gFCO!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!gFCO!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!gFCO!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!gFCO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63acc7b0-0635-48ac-b36b-1dfe896e3acb_1456x832.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Most of the LLM criticism I see would be easy to fix if people had a basic understanding of the tech. OpenAI shares some blame for this, since their chat UI has become the standard but explains nothing and tries to be generic, leaning on zero-shot to seem magical and leading to poor results out of the box. </p><p>Previously, people were using the inferior free GPT-3.5. With GPT-4o, there's no excuse for uninformed takes flooding social media and YouTube. OpenAI's terrible ChatGPT UI and closed-source GPT store are still problematic. <a href="https://docs.anthropic.com/en/prompt-library/library">At least Anthropic is trying to teach better prompt writing</a> (with more effort than OpenAI).</p><p>But back to my rant...</p><p>Many users don't try to tune the system; they ask poorly written questions, get unsatisfactory results, and then complain online. 
We're 2 years into this cycle - it's time to wise up.</p><p>By now, we should all know that these systems are few-shot predictive models that respect the chain of command. Yet users provide neither the few-shot examples nor the commands the model should follow. It's no wonder ChatGPT-written resumes and landing pages sound like professional BS with poor English.</p><h4>To get good results, you need to:</h4><ol><li><p>Give it examples</p></li><li><p>Understand the chain of command</p></li></ol><h4>GPTs can handle zero-shot and few-shot (but not really)</h4><ul><li><p>Zero-shot learning: performing new tasks using broad pre-training</p></li><li><p>One-shot and few-shot learning: generalizing from 1-to-many examples to perform a task on new inputs</p></li></ul><p>While GPT should handle both, its zero-shot capabilities aren't reliable for much beyond toying around. Always provide 2-10 examples of what you want - otherwise, you'll get predictions based on general knowledge, which means bad results.</p><h3>The GPT Chain of Command</h3><ol><li><p>OpenAI and other providers set the basic rules and capabilities.</p></li><li><p>Developers can customize these settings for specific tasks.</p></li><li><p>End-user requests go through the customized settings.</p></li></ol><p>For non-developers, the best way to leverage the chain of command is to create a custom GPT with a system message specifying your requirements, along with the necessary examples.</p><p>If you're coding against the APIs directly, you have more options beyond just system messages, such as fine-tuning. 
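</p><p>Against the API, the two levers you control (the system message and the few-shot examples) are just entries in the message list you send. Here's a minimal Ruby sketch of building that list, assuming the community <code>ruby-openai</code> gem; the prompt text and example pairs below are placeholders, not a real setup:</p>

```ruby
# Sketch only: the chain of command expressed as an API message list.
# Layer 1 (provider rules) is fixed upstream; layer 2 is the system
# message; layer 3 is the end-user request. Few-shot examples ride
# along as paired user/assistant turns.
SYSTEM_PROMPT = "You are a draft polisher. Keep the author's tone and vocabulary.".freeze

# Hypothetical before/after pairs -- use real drafts in practice.
FEW_SHOT = [
  { input: "rough draft one", output: "polished version one" },
  { input: "rough draft two", output: "polished version two" }
].freeze

def build_messages(user_input)
  messages = [{ role: "system", content: SYSTEM_PROMPT }]
  FEW_SHOT.each do |example|
    messages << { role: "user", content: example[:input] }
    messages << { role: "assistant", content: example[:output] }
  end
  messages << { role: "user", content: user_input }
end

# With the ruby-openai gem, the call would look roughly like:
#   client = OpenAI::Client.new(access_token: ENV["OPENAI_API_KEY"])
#   client.chat(parameters: { model: "gpt-4o", messages: build_messages(draft) })
```

<p>The few-shot pairs become alternating user/assistant turns, so the model sees concrete before/after demonstrations before it touches your actual input.</p><p>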
However, in the context of ChatGPT, system messages are the primary tool available (function calls and tags can be discussed in another post).</p><p>Tip: You can use Anthropic's prompt generator to create the initial system message and examples, then paste it into ChatGPT's custom GPT (Anthropic's UI is too bad to use directly): https://x.com/danielvlopes/status/1792978707201831155</p><p>Here's a simplified version of what I use for my Blog Posts custom GPT in ChatGPT:</p><pre><code># Identity 
You are a draft polisher assistant. Your job is to receive long-form, poorly formatted drafts and convert them to a final version that maintains the tone and vocabulary but uses fewer words, fixes the grammar, adds more spacing and better structure, and makes things clearer.

# Steps to take

To do this, first carefully read through the entire blog post to understand its structure and key points. Then, go through and rewrite the polished version of each section while keeping the same titles and technical insights.

Here are some tips:

- Remove unnecessarily complex language. Replace these with plain, straightforward terms. For example:
-- Instead of "leverage", use "use"  
-- Instead of "synergize", explain what you actually mean, like "work together"
-- Instead of "disintermediation", just explain the concept in simple terms
- Break up long, run-on sentences into shorter, clearer ones
- Use contractions like "it's" and "you'll" to make it sound more natural and conversational 
- Identify the technical parts of the text and don't remove these
- Keep the tone straightforward and professional, avoiding overly casual language but ensuring it doesn't become too formal.
- Keep the personal touch and original intent of the content.
- Format the condensed version of the post with clear section breaks and spacing for readability.

Here's a list of banned words; never ever use any of the following (or similar words): "delve", "embarked", "journey", "thrilled", "aim", "tailored", "fledgling", "shrouded", "endeavor", "quantifiable", "empirical", "whereabouts", "debut", "keen", "craft", "tailor", "elevate", "ignite", "empower", "unleash", "horizon", "harness", "diving into", "interconnect", "enhance".

As you write the new version, make sure to maintain the same voice, tone, vocabulary, and style as the original blog post. The goal is not to rewrite it completely, but to produce a slightly more polished version in an easily digestible format for people short on time.

# Example 1

Input: 
```
Original draft
```

Output: 
```
Well written version
```

# Example 2

Input: 
```
Original draft
```

Output: 
```
Well written version
```

# Example 3

Input: 
```
Original draft
```

Output: 
```
Well written version
```</code></pre><p>Put in some effort; otherwise, you are to blame for the bad results. With early-stage technology, we can't have the luxury of expecting things to work perfectly right out of the box.</p><p></p>]]></content:encoded></item><item><title><![CDATA[W2 Recap: 4o, TailwindUI, lot of code]]></title><description><![CDATA[Here's what I accomplished in my 2nd week of my 3-month build & learn cycle]]></description><link>https://journal.daniellopes.dev/p/week-2-recap-4o-tailwindui-lot-of</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/week-2-recap-4o-tailwindui-lot-of</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Sat, 18 May 2024 19:05:20 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!BHAg!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!BHAg!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!BHAg!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!BHAg!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png 848w, 
https://substackcdn.com/image/fetch/$s_!BHAg!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!BHAg!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!BHAg!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2034800,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!BHAg!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!BHAg!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png 848w, 
https://substackcdn.com/image/fetch/$s_!BHAg!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!BHAg!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F316055e5-eb03-4b72-a5ac-03d9b1d89595_1456x832.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>In May I left my <a href="https://journal.daniellopes.dev/p/end-of-a-7-year-journey">full-time job</a>  to start a 3-month LLM 
learning cycle by building a few products of my own. I&#8217;m documenting my journey here. Every week I post a recap of how the week went, things I learned, etc.</p><h3>Idea selection</h3><p>As <a href="https://journal.daniellopes.dev/i/144571357/why-now">I said before</a>, I'm currently focused on small indie-hacker-scoped projects. What does that mean? </p><p>I'm sorting ideas by level of scope and potential size. For the first projects, I'm avoiding super ambitious things; they need a large surface area and tend to have a lot of competition. In the middle, there are &#8220;during a gold rush, sell shovels&#8221; types of ideas, like AI infrastructure tools for developers, where the need is clear but so many teams are rushing to build them. </p><p>I'm looking for small things, or things with seemingly bad business models, that I want to use myself. When picking a market, you either solve something for yourself or for others. For now, I'm choosing things I want for myself to save time on initial customer research. </p><p>However, I still want something with clusters of people easily found online so marketing isn&#8217;t an uphill battle.</p><p>To recap: I need something not hard to market, with a bad <a href="https://en.wikipedia.org/wiki/Total_addressable_market">TAM</a>, small in scope, and that I want for myself (likely small apps, wrappers, or add-ons for existing players). </p><p><strong>Perfect recipe for disaster &#128517; Why?</strong> </p><ol><li><p>It gives me time to focus on adjacent things I'll reuse later, like a clean boilerplate with UI libraries, authentication, billing, and API clients. </p></li><li><p>In this first month, I'm basically starting easy to rest a bit while staying creative, studying, and even having a chance to build some baby MRR (though that's very unlikely). </p></li></ol><p>I've narrowed my list down to 5 potential ideas for now, and I&#8217;m already working on a side project that I&#8217;d started in the past. 
These ideas will be for the next project.</p><h1>Week recap: progress!</h1><h3>Boilerplate</h3><h4>Rails &#128642;</h4><p>I'm sticking with Rails - it's stable, battle-tested, and I know it well. Not interested in Next.js yet, maybe just for marketing later.</p><p>Going with the vanilla Rails stack for now, using Hotwire instead of React for the front end. Everything has a learning curve, and modern-day React is one I don't want to tackle now. Might use React later for more interactive ideas.</p><h4>Jumpstart Pro &#128640;</h4><p>Didn't want to waste time setting up billing (even with Stripe), accounts, authentication, 2FA, or even little stuff like pagination. Went with <a href="https://jumpstartrails.com">Jumpstart Pro</a> - it's a good starting point with high-quality code that's easy to customize, even if I'd prefer something smaller with fewer dependencies. JSP also includes a lot of things I wasn&#8217;t familiar with, like <a href="https://code.visualstudio.com/docs/devcontainers/containers">Dev Containers</a>, so it's been a fun learning experience.</p><h4>TailwindCSS + TailwindUI (Pros &amp; Cons) &#127788;&#65039;</h4><p>Switched from theming Bootstrap (like I did for Canopy) to TailwindUI. Not having CSS files makes it perfect for code generators like custom ChatGPT and V0, which are great at generating Tailwind markup. TailwindUI is also very versatile and covers a lot, so I get a productivity boost from the copy, paste, and customize workflow.</p><p>However, using TailwindUI with vanilla Rails hasn't been smooth. I find myself coding basic things because TailwindUI has no vanilla JS option for behaviors like dropdowns, carousels, and off-canvas panels. Jumpstart's Stimulus helps with some of that. 
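</p><p>Behaviors aside, the long utility-class strings themselves become a chore in plain ERB: without React components to hold them, they get pasted into every view, and the usual fix is pulling them into a helper. A hypothetical Ruby sketch of that pattern; the module, method, and class names are made up for illustration, not TailwindUI's actual markup:</p>

```ruby
# Sketch: collecting a repeated TailwindUI-style utility string in one
# place. The class lists are illustrative, not real TailwindUI markup.
module ButtonHelper
  BASE = "rounded-md px-3 py-2 text-sm font-semibold shadow-sm".freeze

  VARIANTS = {
    primary:   "bg-indigo-600 text-white hover:bg-indigo-500",
    secondary: "bg-white text-gray-900 ring-1 ring-inset ring-gray-300"
  }.freeze

  # Returns the full class string for a given button variant.
  def self.button_classes(variant = :primary)
    "#{BASE} #{VARIANTS.fetch(variant)}"
  end
end

# In an actual Rails helper this would wrap content_tag, e.g.:
#   def tw_button(label, variant: :primary)
#     content_tag(:button, label, class: ButtonHelper.button_classes(variant))
#   end
```

<p>Views then call one helper instead of copying the string, so a design tweak touches a single file.</p><p>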
Another issue is that TailwindUI's markup was written for React components, so you often need to abstract the repeated class strings into Rails helpers.</p><p>This week: 2 days to cover TailwindUI basics, 1 day to learn Jumpstart, plus TailwindCSS's daily learning curve. But I'm already feeling quite productive.</p><h3>Brainstorming with friends</h3><p>I have a lot of people reaching out to network and brainstorm ideas. Last week I spent about &amp; hours on this.</p><h4>Canopy: GPT-4o</h4><p>I work 8 hours a week at Canopy, doing 1-2 hours at a time. After OpenAI launched GPT-4o, I migrated our AI Assistant:</p><ul><li><p>It's much <strong>cheaper and faster</strong> (so fast that streaming isn&#8217;t necessary anymore)</p></li><li><p>Overall <strong>better results</strong>: I tested using our rubric of 200 questions (not a formal eval system yet). After migration, the results were mostly better.</p></li></ul><p><strong>Two small issues:</strong></p><ol><li><p><strong>Acronyms:</strong> It made up some acronyms, which I fixed with prompt engineering.</p></li><li><p><strong>JSON</strong>: We use OpenAI's Assistants API with their RAG tool, so we can't use "JSON mode" and I ask for JSON via prompt engineering. The model tried to wrap the markdown to avoid breaking the JSON, which we fixed by adjusting the code.</p></li></ol><p>The <strong>speed and price drop let us split our 200+ line prompt into two prompts</strong>, running the query sequentially:</p><ol><li><p>Ask for the answer, present it to the user</p></li><li><p>Ask for additional metadata async and refresh that part of the screen (WIP, shipping next week)</p></li></ol><p>The cost drop opens up the opportunity to use GPT-4-level intelligence for everything, including B2C customers with low subscription prices.</p><h4>Product 1: A Lot of Work Done</h4><p>As I said before, I'm already working on my first product, which I've been building a few hours a week since early this year. 
It's a small thing, but a useful tool for myself that benefits from LLM support.</p><p>I spent 3 days (30 hours) migrating the code to Jumpstart Pro and 4o, and adding features for launch - shooting for next weekend. I also worked on the marketing landing page.</p><p><strong>Still to do: </strong>Complete an &#8220;insights&#8221; feature. Add an LLM-powered overview screen. TailwindUI is helping a lot with the design. So, I&#8217;m estimating ~3 to 5 more days before I'm able to onboard users.</p><p>I&#8217;ll need to set up new servers due to significant changes from Jumpstart Pro. Setting up servers again will likely take at most one day.</p><p>Goal: For this week or next, put the product up, add a marketing landing page with a waitlist, and send it to the communities I&#8217;m a member of.</p><h2>The bad things</h2><ol><li><p><strong>Poor sleep hygiene:</strong> Worked too close to bed, messing up my sleep. As I get older, I need to be a lot smarter about sleep hygiene, or the price is real.</p></li><li><p><strong>Not enough exercise:</strong> Only exercised 3 times: two half-assed 40min weight sessions and a hard 45min VO2 max virtual cycling race on Zwift. I usually do twice that, and it's important for my mental health.</p></li><li><p><strong>Missed meet-ups:</strong> I'm trying to go to more. There were two interesting ones this week, an Agent Guidance session and a Gemini vs 4o hackathon on Saturday, and I skipped both.</p></li></ol><h2>Next week</h2><p>I'm off for 3 days on a last-minute camping trip, so this week will be short. Working this weekend to account for it. 
Most of my time will be split between working on Product 1 and some LLM work for Canopy.</p><p></p>]]></content:encoded></item><item><title><![CDATA[Week 1 Recap: Ideas, Boilerplate, Monitoring & (not) Shipping]]></title><description><![CDATA[Here's what I accomplished in my first week of my 3-month build & learn cycle]]></description><link>https://journal.daniellopes.dev/p/week-1-recap-ideas-boilerplate-monitoring</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/week-1-recap-ideas-boilerplate-monitoring</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Mon, 13 May 2024 07:12:38 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf521100-dc78-4e81-b853-54e315e1c866_1456x832.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!B-8b!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf521100-dc78-4e81-b853-54e315e1c866_1456x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!B-8b!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf521100-dc78-4e81-b853-54e315e1c866_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!B-8b!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf521100-dc78-4e81-b853-54e315e1c866_1456x832.png 848w, 
https://substackcdn.com/image/fetch/$s_!B-8b!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf521100-dc78-4e81-b853-54e315e1c866_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!B-8b!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf521100-dc78-4e81-b853-54e315e1c866_1456x832.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!B-8b!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf521100-dc78-4e81-b853-54e315e1c866_1456x832.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/bf521100-dc78-4e81-b853-54e315e1c866_1456x832.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1791418,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!B-8b!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf521100-dc78-4e81-b853-54e315e1c866_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!B-8b!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf521100-dc78-4e81-b853-54e315e1c866_1456x832.png 848w, 
https://substackcdn.com/image/fetch/$s_!B-8b!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf521100-dc78-4e81-b853-54e315e1c866_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!B-8b!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf521100-dc78-4e81-b853-54e315e1c866_1456x832.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I've been toying with LLMs since end of 2022, trying to create an AI Assistant for my company, <a 
href="https://canopy.is/">Canopy</a>. The tech wasn&#8217;t quite there until GPT-4 came along. Last week, after launching our AI Assistant in alpha, I decided to leave <a href="https://journal.daniellopes.dev/p/end-of-a-7-year-journey">my job to focus on developing my own ideas</a>. I&#8217;ll be documenting my journey here as I go, and this is the recap of my first week.</p><h4>Why now?</h4><p>With the last couple of years' progress, many traditional ML projects have been (or could be) replaced by language models, making things easier compared to just three years ago. Things like unstructured data, classification, sentiment analysis, NLP, speech, and much more are now all manageable by one person or a tiny team.</p><p>I think there are four ways to benefit from what is happening:</p><ol><li><p>Big Tech working on foundational infrastructure or large problems.</p></li><li><p>Moonshot projects challenging existing players, like Perplexity vs. Google.</p></li><li><p>Vertical applications (or what people are now calling "Applied AI").</p></li><li><p>Indie hackers focused on things that don't look venture-scale at first glance.</p></li></ol><p>For the next three months, I'm personally interested in item 4. Maybe later I'll try to join a team working on the others.</p><h4>Setting a 3-month deadline</h4><p>Having a short deadline helps keep me motivated and accountable. It pushes me to focus on practical, achievable goals. </p><p>Since I work in product and not ML or research, I'm interested in end-user applications. There's a ton of effort involved in making LLMs (unstable, unreliable &amp; expensive tech) work smoothly in a finished product. </p><p>For instance, if I were only studying, factors like accuracy, cost, or user experience would be glossed over. 
But if I need to ship and charge for something, that&#8217;s the opposite.</p><p>With my skill set, I think three months is a doable but still challenging timeline to ship at least two products (my stretch goal is six).</p><h1>Now, to the first-week recap</h1><p><strong>I had 5 goals for this past week:</strong></p><ol><li><p>Ship my personal site, start a Substack, clean up social media. </p></li><li><p>Get a boilerplate codebase for my projects (sign-up, subscription, etc).</p></li><li><p>Organize and research my ideas: pick 3 to 6. </p></li><li><p>Canopy: monitor the Alpha launch of our AI Assistant.</p></li><li><p>Bonus: Ship a WIP project that I had already.</p></li></ol><h3>&#9989; Goal 1: Personal site, Substack &amp; social media</h3><p><strong>Personal site:</strong> Over the weekend, I started working on a personal site that I&#8217;d need to send to some people. I also used it to play with Next, Vite, Astro, Tailwind, and Shadcn - all things that aren't part of the Canopy stack. </p><p>I went with Next + TailwindUI. The site is live here: <a href="https://daniellopes.dev">https://daniellopes.dev</a></p><p>For my products, I'll be using just Tailwind and my usual Rails.</p><p><strong>Substack:</strong> <a href="https://notes.andymatuschak.org/zCMhncA1iSE74MKKYQS5PBZ">Working with the garage door open is something</a> I always wanted to do for <a href="https://canopy.is/">Canopy</a>, but I left that for my co-founder. So, last week I started this journal and posted the first post: <a href="https://journal.daniellopes.dev/p/parsing-7-years-of-stand-up-updates">Using LLMs to parse 7 years of stand-up updates</a> (which was also an exercise in trying different summarization approaches). </p><p>Writing a ton is also part of one of my ideas that I might work on next month.</p><p><strong>Social media: </strong>I want to stay more informed about things happening in San Francisco (meetups, hackathons, etc.), so I spent a few hours making a list of people I want to follow more closely. 
Social media isn't all just noise; it can be valuable.</p><h3>&#129000; Goal 2: Organize all my ideas</h3><p>I made a lot of progress on this goal, but not enough to confidently have three solid ones. I'll keep working on this next week. Here's the gist:</p><p>I keep a list of ideas, logging anything that comes to mind (features or products). I always have somewhere around 100 to 200 items.</p><p>This week, my primary goal was to organize my list, considering these five criteria:</p><ul><li><p><strong>Is it a newly viable thing?</strong> And is it flooded with people doing good work already? (Areas like Knowledge Base for Companies or Customer Support are already saturated.)<br></p></li><li><p><strong>Do I want this for myself? </strong>Building something for others demands a lot of research, which isn't feasible within a three-month timeline.<br></p></li><li><p><strong>Does it have commercial potential?</strong></p><ul><li><p><strong>If there's no commercial potential</strong>, the aim should be learning or personal use; don't waste time on unviable or <a href="https://matt-rickard.com/tarpit-ideas">tarpit</a> ideas. Do just enough to meet the goal and move on. At the end of the day, I want people to use the things I build.</p></li><li><p><strong>If there's commercial potential, </strong>do I have a unique perspective on solving the challenges? Then do the usual analysis: pricing, customer base needed for an MRR target, competitors, etc.<br></p></li></ul></li><li><p>Does it have the <strong>potential for a single feature</strong> that could be a lot better than existing methods? And can it be developed as an add-on for existing players, or does it need to be a replacement? (Replacements are less interesting with this timeline.)<br></p></li><li><p><strong>What&#8217;s the marketing effort? Can I market it alone?</strong> If it's B2B, can I identify a sustainable group of people to target? 
If it's B2C, can I come up with marketing strategies that don't rely on ads or require excessive time?</p></li></ul><p>Many ideas I had on the list from last year had no credible competitors six months ago, but now have promising ones. So, I ditched those. While I don't mind competition when working with more resources, working solo, I don&#8217;t need my back against the wall all the time.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!l4d7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccb3aa00-ff5b-4629-a51c-28acab3070e0_3662x2626.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!l4d7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccb3aa00-ff5b-4629-a51c-28acab3070e0_3662x2626.png 424w, https://substackcdn.com/image/fetch/$s_!l4d7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccb3aa00-ff5b-4629-a51c-28acab3070e0_3662x2626.png 848w, https://substackcdn.com/image/fetch/$s_!l4d7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccb3aa00-ff5b-4629-a51c-28acab3070e0_3662x2626.png 1272w, https://substackcdn.com/image/fetch/$s_!l4d7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccb3aa00-ff5b-4629-a51c-28acab3070e0_3662x2626.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!l4d7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccb3aa00-ff5b-4629-a51c-28acab3070e0_3662x2626.png" width="1456" 
height="1044" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ccb3aa00-ff5b-4629-a51c-28acab3070e0_3662x2626.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1044,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:385072,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!l4d7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccb3aa00-ff5b-4629-a51c-28acab3070e0_3662x2626.png 424w, https://substackcdn.com/image/fetch/$s_!l4d7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccb3aa00-ff5b-4629-a51c-28acab3070e0_3662x2626.png 848w, https://substackcdn.com/image/fetch/$s_!l4d7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccb3aa00-ff5b-4629-a51c-28acab3070e0_3662x2626.png 1272w, https://substackcdn.com/image/fetch/$s_!l4d7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fccb3aa00-ff5b-4629-a51c-28acab3070e0_3662x2626.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">I&#8217;m organizing all the ideas with the Obsidian Kanban plugin, and the items in purple already have some research around them.</figcaption></figure></div><h3>&#9989; Goal 3: Boilerplate</h3><p>Initially, I planned to use much of Canopy's codebase for non-proprietary elements and create a standard boilerplate for things like sign-up, subscriptions, and admin (which would let me contribute back improvements). </p><p>However, given my time constraints, I'm reluctant to spend excessive time cleaning up things like our complex billing system that covers B2B, B2C, and enterprise solutions.</p><p>I've decided to go with <a href="https://jumpstartrails.com">Jumpstart Pro</a>. There are several things I plan to remove or change, but for now, it's enough.
I&#8217;ve also wanted to learn Tailwind for a while, so it was nice that Jumpstart Pro came with Tailwind already.</p><p>I&#8217;m also sticking to Rails (<a href="https://world.hey.com/dhh/the-one-person-framework-711e6318">The One Person Framework</a> &#8482;&#65039;). </p><p>I&#8217;ve been tempted to make the jump to Python because of how readily available the tooling is in that ecosystem, but the fastest way to ship nothing is to pick a bunch of things you are not an expert in. It&#8217;s too easy to underestimate all the ecosystem details you need to know in order to ship a product that is stable enough to require nearly no maintenance.</p><p>In this case, my goal is to focus on the behavior of LLMs (different models, prompt architecture, cost, reliability, etc.) and not mess around with different frontend or backend technologies.</p><h3>&#9989; Goal 4: Canopy - Monitor the Alpha of the AI Assistant</h3><p>We launched our AI Assistant in Alpha last week, so I focused on getting visibility into how the system is performing. We haven't fully committed to an evaluation strategy yet. For now, we managed to see how satisfied people are, along with some basic analytics and reporting on content that we didn't have for <a href="https://blogs.nvidia.com/blog/what-is-retrieval-augmented-generation/">RAG</a>.</p><p>For &#8220;evals,&#8221; this looks very promising: <a href="https://athina.ai">https://athina.ai</a></p><p>I saw their product at a meet-up in SF last week and found it a better fit for our needs than other tools I'd considered (open-source solutions like <a href="https://github.com/openai/evals">OpenAI</a>&#8217;s or <a href="https://ragas.io">Ragas</a>).</p><h3>&#10060; Goal 5: Ship a WIP product</h3><p>I didn't succeed with this one. </p><p>Last year, I started working on a product that solves a personal need.
None of the business calculations add up (cost per person vs. what I can charge, TAM, competitors, etc.), but it meets my criteria of being something I want to use and something I can learn from. </p><p>I've already learned a lot from this project. It's what I worked on over the weekends last year to learn different models, prompting, and RAG architectures without getting burnt out from just working at Canopy. It's ~80% done, but now I just need the last stretch to make it possible for others to sign up &amp; buy it so I'm not paying for other people's API calls. </p><p>I'll probably need another three days to get it ready for other users. Since I chose Jumpstart Pro, I'm going to migrate the product to it and see how that goes. So, this will carry over into Week 2.</p><h2>Plan for Week 2 </h2><p>Switching jobs, picking ideas, setting up the basics, and wrapping up existing projects is kind of like tying up loose ends. It's not the most thrilling work, but it's still important.</p><p>For next week, I have 3 main goals: launch (1) and market (2) the project I just mentioned, and do more research on my ideas (3).</p>]]></content:encoded></item><item><title><![CDATA[Using ChatGPT & Custom GPTs for code brainstorming]]></title><description><![CDATA[As part of my attempt to get better at prompt engineering I'm really trying to do as much as possible of my work with some AI support (coding, design, reviews, copy, etc.]]></description><link>https://journal.daniellopes.dev/p/using-chatgpt-custom-gpts-for-code</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/using-chatgpt-custom-gpts-for-code</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Fri, 10 May 2024 19:06:15 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!w3C0!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png" length="0"
type="image/png"/><content:encoded><![CDATA[<p>As part of my attempt to get better at prompt engineering, I'm trying to do as much of my work as possible with some AI support (coding, design, reviews, copy, etc.; use it for everything first).</p><p>Custom GPTs are the ChatGPT feature that makes me use it more than anything else (more than Claude, Perplexity, or Gemini). I have a handful of Custom GPTs that I use every day, and I create a bunch more for specific needs as well. </p><p>One of my favorite use cases for one of my recurring Custom GPTs is to brainstorm different solutions before I start coding (this one uses a <a href="https://chatgpt.com/g/g-ZoyP3Ze9L-ruby-rails-helper">Rails-specific GPT</a>). I pretty much start planning my work there. It will often give me good suggestions, or at least some good naming for things. Then, at the end, I finish by asking for a review as well.</p><p>In this example, I already know which patterns I might want to use. I could do it myself no problem, but I find it great to see a draft of all the different ideas in 5 seconds first, and then decide whether to work it out in my head or pick just one and spend 20 minutes executing on it.</p><h1>The request</h1><pre><code>I need to reuse these lines on a different controller. What would you suggest? Maybe should I extract a service or PORO class? or a concern for controllers?

@learning_path = current_person.personal_learning_path
@learning_path_modules = @learning_path.training_modules.includes(:lessons) if @learning_path
@learning_path_progress = @learning_path.progress_for(current_person) if @learning_path
@training_modules = load_training_modules

---

def load_training_modules
  if current_organization_training_subscription&amp;.trial?
    TrainingModule.standard_order_for_trial
  else
    TrainingModule.preferred_order_for(
      current_person,
      order_by_preference: "recommendations",
      exclude_seen_modules: false,
      consider_watched_lessons: true,
      alternate_results: false
    )
  end
end

====HERES THE FULL CONTROLLER=====

class TrainingModulesController &lt; TrainingBaseController
  menu_item :training_modules
  before_action :initial_breadcrumbs

  def index
    @assessment = current_person.last_assessment_with_self_response_complete
    current_person.generate_personal_learning_path_if_needed!

    @learning_path = current_person.personal_learning_path
    @learning_path_modules = @learning_path.training_modules.includes(:lessons) if @learning_path
    @learning_path_progress = @learning_path.progress_for(current_person) if @learning_path

    @training_modules = load_training_modules
    @display_tour = !current_navigation_preference.learning_paths_tour_closed?
  end

  def show
    @training_module = TrainingModule.includes(:lessons).find(params[:id])
    @progress = TrainingModuleProgress.new(current_person, @training_module)
    @lesson = @progress.next_available_lesson || @training_module.lessons.first

    current_person.training_history_entries.track(request.url, @training_module)

    redirect_to training_module_lesson_path(@training_module, @lesson)
  end

  private

  def initial_breadcrumbs
    breadcrumb.add "Modules", training_modules_path
  end

  def load_training_modules
    if current_organization_training_subscription&amp;.trial?
      TrainingModule.standard_order_for_trial
    else
      TrainingModule.preferred_order_for(
        current_person,
        order_by_preference: "recommendations",
        exclude_seen_modules: false,
        consider_watched_lessons: true,
        alternate_results: false
      )
    end
  end
end</code></pre><h1>GPT Response</h1><p>Given the repetitive use of the code involved in fetching and preparing `@learning_path` and related entities across different controller actions, extracting these operations into a reusable component is a good idea. This will help maintain DRY (Don't Repeat Yourself) principles and ensure cleaner and more maintainable code. Here are a few options to consider:</p><p>### 1. Service Object</p><p>A service object is ideal for encapsulating business logic that doesn't naturally fit within a model or a controller. Since your code involves preparing data for views based on user-specific and contextual business rules, a service object could be a good fit.</p><pre><code># app/services/learning_path_service.rb
class LearningPathService
  def initialize(person)
    @person = person
  end

  def fetch_data
    learning_path = @person.personal_learning_path
    {
      learning_path: learning_path,
      learning_path_modules: learning_path&amp;.training_modules&amp;.includes(:lessons),
      learning_path_progress: learning_path&amp;.progress_for(@person),
      training_modules: load_training_modules
    }
  end

  private

  def load_training_modules
    if @person.current_organization_training_subscription&amp;.trial?
      TrainingModule.standard_order_for_trial
    else
      TrainingModule.preferred_order_for(
        @person,
        order_by_preference: "recommendations",
        exclude_seen_modules: false,
        consider_watched_lessons: true,
        alternate_results: false
      )
    end
  end
end</code></pre><p>Usage in a controller:</p><pre><code>def index
  service = LearningPathService.new(current_person)
  data = service.fetch_data
  @learning_path, @learning_path_modules, @learning_path_progress, @training_modules = data.values_at(:learning_path, :learning_path_modules, :learning_path_progress, :training_modules)
  @display_tour = !current_navigation_preference.learning_paths_tour_closed?
end</code></pre><p>### 2. Concern</p><p>If the logic is tightly coupled with ActiveRecord objects or used across multiple controllers, creating a concern might be a better approach. This keeps controller-specific logic within the context of controllers.</p><pre><code># app/controllers/concerns/learning_path_loadable.rb

module LearningPathLoadable
  extend ActiveSupport::Concern

  def load_learning_path_data
    @learning_path = current_person.personal_learning_path
    @learning_path_modules = @learning_path&amp;.training_modules&amp;.includes(:lessons)
    @learning_path_progress = @learning_path&amp;.progress_for(current_person)
    @training_modules = load_training_modules
  end

  private

  def load_training_modules
    if current_organization_training_subscription&amp;.trial?
      TrainingModule.standard_order_for_trial
    else
      TrainingModule.preferred_order_for(
        current_person,
        order_by_preference: "recommendations",
        exclude_seen_modules: false,
        consider_watched_lessons: true,
        alternate_results: false
      )
    end
  end
end</code></pre><p>Usage in a controller:</p><pre><code>class TrainingModulesController &lt; TrainingBaseController
  include LearningPathLoadable

  def index
    load_learning_path_data
    @display_tour = !current_navigation_preference.learning_paths_tour_closed?
  end
end</code></pre><p>### Decision</p><p>The choice between a service object and a concern depends on how closely the logic is tied to the controller's responsibilities. If the logic is more about data manipulation and less about HTTP-specific operations, a service object is more suitable. If it's about setting up controller instance variables and integrates closely with controller lifecycle (like before actions), a concern might work better. In your case, either approach would streamline your code and enhance maintainability, with a slight preference for a service object given the nature of the logic involved.</p><h1>Final decision</h1><p>I ended up with the concern approach, with code very similar to the original GPT suggestion.</p><h1>Custom GPT &#8220;source code&#8221;</h1><p>This custom GPT is the one I use for my <a href="https://canopy.is">Canopy</a> setup, covering the following use cases (and it has a few examples of our codebase inside):</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!w3C0!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!w3C0!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png 424w, https://substackcdn.com/image/fetch/$s_!w3C0!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png 848w, 
https://substackcdn.com/image/fetch/$s_!w3C0!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png 1272w, https://substackcdn.com/image/fetch/$s_!w3C0!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!w3C0!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png" width="1456" height="860" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:860,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:4267008,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!w3C0!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png 424w, https://substackcdn.com/image/fetch/$s_!w3C0!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png 848w, 
https://substackcdn.com/image/fetch/$s_!w3C0!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png 1272w, https://substackcdn.com/image/fetch/$s_!w3C0!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7763b1cc-1e8b-4e99-baa1-ac9ef8191a1c_4224x2494.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><div class="github-gist" data-attrs="{&quot;innerHTML&quot;:&quot;<div id=\&quot;gist130179441\&quot; class=\&quot;gist\&quot;>\n    
<div class=\&quot;gist-file\&quot; translate=\&quot;no\&quot; data-color-mode=\&quot;light\&quot; data-light-theme=\&quot;light\&quot;>\n      <div class=\&quot;gist-data\&quot;>\n        <div class=\&quot;js-gist-file-update-container js-task-list-container\&quot;>\n  <div id=\&quot;file-rails_gpt-txt\&quot; class=\&quot;file my-2\&quot;>\n    \n    <div itemprop=\&quot;text\&quot; class=\&quot;Box-body p-0 blob-wrapper data type-text  \&quot;>\n\n        \n<div class=\&quot;js-check-bidi js-blob-code-container blob-code-content\&quot;>\n\n  <template class=\&quot;js-file-alert-template\&quot;>\n  <div data-view-component=\&quot;true\&quot; class=\&quot;flash flash-warn flash-full d-flex flex-items-center\&quot;>\n  <svg aria-hidden=\&quot;true\&quot; height=\&quot;16\&quot; viewBox=\&quot;0 0 16 16\&quot; version=\&quot;1.1\&quot; width=\&quot;16\&quot; data-view-component=\&quot;true\&quot; class=\&quot;octicon octicon-alert\&quot;>\n    <path d=\&quot;M6.457 1.047c.659-1.234 2.427-1.234 3.086 0l6.082 11.378A1.75 1.75 0 0 1 14.082 15H1.918a1.75 1.75 0 0 1-1.543-2.575Zm1.763.707a.25.25 0 0 0-.44 0L1.698 13.132a.25.25 0 0 0 .22.368h12.164a.25.25 0 0 0 .22-.368Zm.53 3.996v2.5a.75.75 0 0 1-1.5 0v-2.5a.75.75 0 0 1 1.5 0ZM9 11a1 1 0 1 1-2 0 1 1 0 0 1 2 0Z\&quot;></path>\n</svg>\n    <span>\n      This file contains bidirectional Unicode text that may be interpreted or compiled differently than what appears below. 
To review, open the file in an editor that reveals hidden Unicode characters.\n      <a class=\&quot;Link--inTextBlock\&quot; href=\&quot;https://github.co/hiddenchars\&quot; target=\&quot;_blank\&quot;>Learn more about bidirectional Unicode characters</a>\n    </span>\n\n\n  <div data-view-component=\&quot;true\&quot; class=\&quot;flash-action\&quot;>        <a href=\&quot;{{ revealButtonHref }}\&quot; data-view-component=\&quot;true\&quot; class=\&quot;btn-sm btn\&quot;>    Show hidden characters\n</a>\n</div>\n</div></template>\n<template class=\&quot;js-line-alert-template\&quot;>\n  <span aria-label=\&quot;This line has hidden Unicode characters\&quot; data-view-component=\&quot;true\&quot; class=\&quot;line-alert tooltipped tooltipped-e\&quot;>\n    <svg aria-hidden=\&quot;true\&quot; height=\&quot;16\&quot; viewBox=\&quot;0 0 16 16\&quot; version=\&quot;1.1\&quot; width=\&quot;16\&quot; data-view-component=\&quot;true\&quot; class=\&quot;octicon octicon-alert\&quot;>\n    <path d=\&quot;M6.457 1.047c.659-1.234 2.427-1.234 3.086 0l6.082 11.378A1.75 1.75 0 0 1 14.082 15H1.918a1.75 1.75 0 0 1-1.543-2.575Zm1.763.707a.25.25 0 0 0-.44 0L1.698 13.132a.25.25 0 0 0 .22.368h12.164a.25.25 0 0 0 .22-.368Zm.53 3.996v2.5a.75.75 0 0 1-1.5 0v-2.5a.75.75 0 0 1 1.5 0ZM9 11a1 1 0 1 1-2 0 1 1 0 0 1 2 0Z\&quot;></path>\n</svg>\n</span></template>\n\n  <table data-hpc class=\&quot;highlight tab-size js-file-line-container js-code-nav-container js-tagsearch-file\&quot; data-tab-size=\&quot;8\&quot; data-paste-markdown-skip data-tagsearch-lang=\&quot;Text\&quot; data-tagsearch-path=\&quot;rails_gpt.txt\&quot;>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L1\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;1\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC1\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;># IDENTITY and PURPOSE</td>\n        </tr>\n        <tr>\n          <td 
id=\&quot;file-rails_gpt-txt-L2\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;2\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC2\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>You are Ruby on Rails programming assistant specialized in Ruby 3, Rails 7, Postgres Stimulus.js, Bootstrap 5, ERB, and Minitest. </td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L3\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;3\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC3\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L4\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;4\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC4\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;># STEPS TO FOLLOW</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L5\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;5\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC5\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L6\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;6\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC6\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>1. 
You have 5 modes, and you adjust your behavior based on each:</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L7\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;7\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC7\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>1.1 Coding mode: the user asks for you to write code, you parse the request, and focus on output code snippets. DONT EXPLAIN anything, just write the code.</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L8\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;8\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC8\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>1.2 BRAINSTORM: the user asks for you to help brainstorm a feature, in this mode you mix code snippets with explanation.</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L9\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;9\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC9\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>1.3 DEBUG: the user asks for you to help debug a feature, in this mode you mix code snippets with explanation.</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L10\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;10\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC10\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>1.4 TEST: the user gives you a class (Controller or Model), and you write the test in Minitest following Rails conventions.</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L11\&quot; class=\&quot;blob-num js-line-number 
js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;11\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC11\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>1.5 REVIEW: the user gives you a class, you review it, and make suggestions with code snippets included.</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L12\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;12\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC12\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L13\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;13\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC13\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>REALLY IMPORTANT: in Coding &amp;amp; Testing mode, you really avoid explaining anything, you just write code. 
<pre><code>In Brainstorming &amp; Debugging mode, you mix code snippets with explanation, but you avoid writing long paragraphs.

## TONE AND STYLE

You adhere to Rails naming conventions for files, methods, and classes: snake_case for files and methods, CamelCase for classes. You also follow the Rails style guide for code formatting and Rubocop rules.

## STYLE OF CODE YOU SHOULD WRITE

Example: Stimulus controller (avoid addEventListener; favor interactions via HTML actions, values, and targets)

import { Controller } from "@hotwired/stimulus";
import { install, uninstall } from "@github/hotkey";

export default class extends Controller {
  connect() {
    this.installHotkeys();
  }

  disconnect() {
    this.uninstallHotkeys();
  }

  installHotkeys() {
    for (const el of this.element.querySelectorAll("[data-hotkey]")) {
      install(el);
    }
  }

  uninstallHotkeys() {
    for (const el of this.element.querySelectorAll("[data-hotkey]")) {
      uninstall(el); // @github/hotkey exposes uninstall() to remove installed hotkeys
    }
  }
}

Example: Rails Controller

class AssessmentsController &lt; TrainingBaseController
  before_action :disable_training_search
  before_action :find_in_progress_or_start_new, only: %i[new create new_via_signup]

  def new_via_signup
    if @assessment.persisted?
      @response = find_or_create_response_for(@assessment)

      if @response.complete?
        redirect_to training_home_path, status: :see_other
      end
    end
  end

  def new
    if @assessment.persisted?
      @response = find_or_create_response_for(@assessment)

      if @response.complete?
        redirect_to assessment_results_path(@assessment), status: :see_other
      end
    else
      @last_assessment = current_person.assessments.order(created_at: :desc).first
    end
  end

  def create
    @assessment.survey = current_organization.assessment_surveys.first!
    @assessment.onboarding_assessment = params[:assessment]&amp;.fetch(:onboarding_assessment).present?

    ActiveRecord::Base.transaction do
      @assessment.save!
      @response = find_or_create_response_for(@assessment)
      track("assessment_created",
            assessment_id: @assessment.id,
            onboarding_assessment: @assessment.onboarding_assessment)
    end

    redirect_to next_assessment_response_answers_path(@response), status: :see_other
  end

  private

  def find_or_create_response_for(assessment)
    response = assessment.responses.find_by(author: current_person)
    response ||= assessment.responses.incomplete.find_or_create_by!(author: current_person)

    response
  end

  def find_in_progress_or_start_new
    @assessment = if current_person.assessments.open.any?
                    current_person.assessments.open.first
                  else
                    current_person.assessments.new
                  end
  end
end

Example: Rails View

&lt;header class="hero-header border-bottom overflow-hidden" data-controller="&lt;%= "turbo-cache-clear" if params[:expire].present? %&gt;"&gt;
  &lt;div class="container pt-4 pt-md-5"&gt;
    &lt;div class="row g-4 g-md-5 justify-content-center"&gt;
      &lt;div class="col-12 col-xl-10 text-center"&gt;
        &lt;h1 class="display-3"&gt;
          Thoughtful,
          &lt;span class="highlight"&gt;lightweight&lt;/span&gt;
          leadership learning.
        &lt;/h1&gt;
        &lt;p class="fs-3 mb-4 text-body-secondary"&gt;
          Our leadership improvement app gives you daily quick tips - plus
          deeper coaching and learning for when you face more complex
          leadership&amp;nbsp;challenges.
        &lt;/p&gt;
        &lt;%= link_to "Start for free",
            marketing_new_training_account_path,
</code></pre>
js-blob-rnum\&quot; data-line-number=\&quot;127\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC127\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>            class: &amp;quot;btn btn-primary btn-lg fs-3 fw-bold&amp;quot; %&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L128\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;128\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC128\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>      &amp;lt;/div&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L129\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;129\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC129\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L130\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;130\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC130\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>      &amp;lt;div class=&amp;quot;col-12 col-md-10 col-xl-8&amp;quot;&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L131\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;131\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC131\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>        &amp;lt;div class=&amp;quot;parallax-container ratio ratio-16x9 position-relative&amp;quot; data-controller=&amp;quot;parallax&amp;quot; aria-hidden=&amp;quot;true&amp;quot;&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L132\&quot; class=\&quot;blob-num js-line-number 
js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;132\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC132\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>          &amp;lt;div class=&amp;quot;image-container&amp;quot;&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L133\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;133\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC133\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>            &amp;lt;figure class=&amp;quot;pretty-border overflow-hidden rounded-1 shadow-lg&amp;quot; data-parallax-target=&amp;quot;image&amp;quot;&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L134\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;134\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC134\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>              &amp;lt;%= image_tag &amp;quot;marketing/screenshots/training/desktop/lesson-1.png&amp;quot;,</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L135\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;135\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC135\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>                  class: &amp;quot;parallax-image&amp;quot;,</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L136\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;136\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC136\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>                  width: &amp;quot;1200&amp;quot;,</td>\n        </tr>\n        <tr>\n  
        <td id=\&quot;file-rails_gpt-txt-L137\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;137\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC137\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>                  height: &amp;quot;750&amp;quot; %&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L138\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;138\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC138\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>            &amp;lt;/figure&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L139\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;139\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC139\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>            &amp;lt;figure class=&amp;quot;ios-screenshot pretty-shadow overflow-hidden p-1 bg-body&amp;quot; data-parallax-target=&amp;quot;image&amp;quot; data-parallax-speed=&amp;quot;0.3&amp;quot;&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L140\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;140\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC140\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>              &amp;lt;%= image_tag &amp;quot;marketing/screenshots/training/ios/home.png&amp;quot;,</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L141\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;141\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC141\&quot; class=\&quot;blob-code blob-code-inner 
js-file-line\&quot;>                  class: &amp;quot;parallax-image pretty-border&amp;quot;,</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L142\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;142\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC142\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>                  width: &amp;quot;393&amp;quot;,</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L143\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;143\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC143\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>                  height: &amp;quot;852&amp;quot; %&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L144\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;144\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC144\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>            &amp;lt;/figure&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L145\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;145\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC145\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>          &amp;lt;/div&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L146\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;146\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC146\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>        &amp;lt;/div&amp;gt;</td>\n        </tr>\n        <tr>\n          <td 
id=\&quot;file-rails_gpt-txt-L147\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;147\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC147\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>      &amp;lt;/div&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L148\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;148\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC148\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    &amp;lt;/div&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L149\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;149\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC149\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>  &amp;lt;/div&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L150\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;150\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC150\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>&amp;lt;/header&amp;gt;</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L151\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;151\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC151\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L152\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;152\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC152\&quot; class=\&quot;blob-code 
blob-code-inner js-file-line\&quot;>Example: Rails Test</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L153\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;153\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC153\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L154\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;154\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC154\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>class NotesControllerTest &amp;lt; ActionDispatch::IntegrationTest</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L155\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;155\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC155\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>  setup do</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L156\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;156\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC156\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    @note = notes(:basecamp_one)</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L157\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;157\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC157\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>  end</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L158\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; 
data-line-number=\&quot;158\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC158\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L159\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;159\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC159\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>  test &amp;quot;GET #index when viewing someone else notes&amp;quot; do</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L160\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;160\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC160\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    sign_in_as people(:jason)</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L161\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;161\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC161\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L162\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;162\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC162\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    get person_notes_path(@note.person)</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L163\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;163\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC163\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td 
id=\&quot;file-rails_gpt-txt-L164\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;164\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC164\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    assert_response :ok</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L165\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;165\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC165\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    assert_match &amp;quot;Add a new note&amp;quot;, response.body</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L166\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;166\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC166\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>  end</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L167\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;167\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC167\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L168\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;168\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC168\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>  test &amp;quot;GET #index when viewing the index with a shared note&amp;quot; do</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L169\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;169\&quot;></td>\n          
<td id=\&quot;file-rails_gpt-txt-LC169\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    sign_in_as people(:andrea)</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L170\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;170\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC170\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L171\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;171\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC171\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    @note.members &amp;lt;&amp;lt; people(:andrea)</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L172\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;172\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC172\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L173\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;173\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC173\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    get person_notes_path(@note.person)</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L174\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;174\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC174\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>\n</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L175\&quot; class=\&quot;blob-num 
js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;175\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC175\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    assert_response :ok</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L176\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;176\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC176\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    assert_match @note.subject, response.body</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L177\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;177\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC177\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>    assert_select &amp;quot;.note-title h2&amp;quot;, @note.subject</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L178\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;178\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC178\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>  end</td>\n        </tr>\n        <tr>\n          <td id=\&quot;file-rails_gpt-txt-L179\&quot; class=\&quot;blob-num js-line-number js-code-nav-line-number js-blob-rnum\&quot; data-line-number=\&quot;179\&quot;></td>\n          <td id=\&quot;file-rails_gpt-txt-LC179\&quot; class=\&quot;blob-code blob-code-inner js-file-line\&quot;>end</td>\n        </tr>\n  </table>\n</div>\n\n\n    </div>\n\n  </div>\n</div>\n\n      </div>\n      <div class=\&quot;gist-meta\&quot;>\n        <a href=\&quot;https://gist.github.com/danielvlopes/84bf43d4785b2090ccd47a84f41206ac/raw/6a68090e0d460a30700ed23935aff0a2d1aeab17/rails_gpt.txt\&quot; 
style=\&quot;float:right\&quot; class=\&quot;Link--inTextBlock\&quot;>view raw</a>\n        <a href=\&quot;https://gist.github.com/danielvlopes/84bf43d4785b2090ccd47a84f41206ac#file-rails_gpt-txt\&quot; class=\&quot;Link--inTextBlock\&quot;>\n          rails_gpt.txt\n        </a>\n        hosted with &amp;#10084; by <a class=\&quot;Link--inTextBlock\&quot; href=\&quot;https://github.com\&quot;>GitHub</a>\n      </div>\n    </div>\n</div>\n&quot;,&quot;stylesheet&quot;:&quot;https://github.githubassets.com/assets/gist-embed-f2e26e2257bd.css&quot;}" data-component-name="GitgistToDOM"><link rel="stylesheet" href="https://github.githubassets.com/assets/gist-embed-f2e26e2257bd.css"><div id="gist130179441" class="gist">
    <div class="gist-file" data-color-mode="light" data-light-theme="light">
      <div class="gist-data">
        <div class="js-gist-file-update-container js-task-list-container">
  <div id="file-rails_gpt-txt" class="file my-2">
    
    <div itemprop="text" class="Box-body p-0 blob-wrapper data type-text  ">

        
<div class="js-check-bidi js-blob-code-container blob-code-content">

  


  <table data-hpc="" class="highlight tab-size js-file-line-container js-code-nav-container js-tagsearch-file" data-tab-size="8" data-paste-markdown-skip="" data-tagsearch-lang="Text" data-tagsearch-path="rails_gpt.txt">
        <tbody><tr>
          <td id="file-rails_gpt-txt-L1" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="1"></td>
          <td id="file-rails_gpt-txt-LC1" class="blob-code blob-code-inner js-file-line"># IDENTITY and PURPOSE</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L2" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="2"></td>
          <td id="file-rails_gpt-txt-LC2" class="blob-code blob-code-inner js-file-line">You are a Ruby on Rails programming assistant specialized in Ruby 3, Rails 7, Postgres, Stimulus.js, Bootstrap 5, ERB, and Minitest. </td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L3" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="3"></td>
          <td id="file-rails_gpt-txt-LC3" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L4" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="4"></td>
          <td id="file-rails_gpt-txt-LC4" class="blob-code blob-code-inner js-file-line"># STEPS TO FOLLOW</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L5" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="5"></td>
          <td id="file-rails_gpt-txt-LC5" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L6" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="6"></td>
          <td id="file-rails_gpt-txt-LC6" class="blob-code blob-code-inner js-file-line">1. You have 5 modes, and you adjust your behavior based on each:</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L7" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="7"></td>
          <td id="file-rails_gpt-txt-LC7" class="blob-code blob-code-inner js-file-line">1.1 CODING: the user asks you to write code; you parse the request and focus on outputting code snippets. DON'T EXPLAIN anything, just write the code.</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L8" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="8"></td>
          <td id="file-rails_gpt-txt-LC8" class="blob-code blob-code-inner js-file-line">1.2 BRAINSTORM: the user asks you to help brainstorm a feature; in this mode you mix code snippets with explanation.</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L9" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="9"></td>
          <td id="file-rails_gpt-txt-LC9" class="blob-code blob-code-inner js-file-line">1.3 DEBUG: the user asks you to help debug a feature; in this mode you mix code snippets with explanation.</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L10" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="10"></td>
          <td id="file-rails_gpt-txt-LC10" class="blob-code blob-code-inner js-file-line">1.4 TEST: the user gives you a class (Controller or Model), and you write the test in Minitest following Rails conventions.</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L11" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="11"></td>
          <td id="file-rails_gpt-txt-LC11" class="blob-code blob-code-inner js-file-line">1.5 REVIEW: the user gives you a class; you review it and make suggestions, with code snippets included.</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L12" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="12"></td>
          <td id="file-rails_gpt-txt-LC12" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L13" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="13"></td>
          <td id="file-rails_gpt-txt-LC13" class="blob-code blob-code-inner js-file-line">REALLY IMPORTANT: in Coding &amp; Testing modes, you avoid explaining anything; you just write code. In Brainstorming &amp; Debugging modes, you mix code snippets with explanation, but you avoid writing long paragraphs.</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L14" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="14"></td>
          <td id="file-rails_gpt-txt-LC14" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L15" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="15"></td>
          <td id="file-rails_gpt-txt-LC15" class="blob-code blob-code-inner js-file-line">## TONE AND STYLE</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L16" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="16"></td>
          <td id="file-rails_gpt-txt-LC16" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L17" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="17"></td>
          <td id="file-rails_gpt-txt-LC17" class="blob-code blob-code-inner js-file-line">You adhere to Rails naming conventions for files, methods, and classes: snake_case for files and methods, CamelCase for classes. You also follow the Rails style guide for code formatting, and you follow Rubocop rules.</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L18" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="18"></td>
          <td id="file-rails_gpt-txt-LC18" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L19" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="19"></td>
          <td id="file-rails_gpt-txt-LC19" class="blob-code blob-code-inner js-file-line">## STYLE OF CODE YOU SHOULD WRITE</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L20" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="20"></td>
          <td id="file-rails_gpt-txt-LC20" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L21" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="21"></td>
          <td id="file-rails_gpt-txt-LC21" class="blob-code blob-code-inner js-file-line">Example: Stimulus Controller (avoid addEventListener; favor interactions via HTML actions, values, and targets)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L22" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="22"></td>
          <td id="file-rails_gpt-txt-LC22" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L23" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="23"></td>
          <td id="file-rails_gpt-txt-LC23" class="blob-code blob-code-inner js-file-line">import { Controller } from "@hotwired/stimulus";</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L24" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="24"></td>
          <td id="file-rails_gpt-txt-LC24" class="blob-code blob-code-inner js-file-line">import { install, uninstall } from "@github/hotkey";</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L25" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="25"></td>
          <td id="file-rails_gpt-txt-LC25" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L26" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="26"></td>
          <td id="file-rails_gpt-txt-LC26" class="blob-code blob-code-inner js-file-line">export default class extends Controller {</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L27" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="27"></td>
          <td id="file-rails_gpt-txt-LC27" class="blob-code blob-code-inner js-file-line">  connect() {</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L28" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="28"></td>
          <td id="file-rails_gpt-txt-LC28" class="blob-code blob-code-inner js-file-line">    this.installHotkeys();</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L29" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="29"></td>
          <td id="file-rails_gpt-txt-LC29" class="blob-code blob-code-inner js-file-line">  }</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L30" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="30"></td>
          <td id="file-rails_gpt-txt-LC30" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L31" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="31"></td>
          <td id="file-rails_gpt-txt-LC31" class="blob-code blob-code-inner js-file-line">  disconnect() {</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L32" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="32"></td>
          <td id="file-rails_gpt-txt-LC32" class="blob-code blob-code-inner js-file-line">    this.uninstallHotkeys();</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L33" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="33"></td>
          <td id="file-rails_gpt-txt-LC33" class="blob-code blob-code-inner js-file-line">  }</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L34" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="34"></td>
          <td id="file-rails_gpt-txt-LC34" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L35" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="35"></td>
          <td id="file-rails_gpt-txt-LC35" class="blob-code blob-code-inner js-file-line">  installHotkeys() {</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L36" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="36"></td>
          <td id="file-rails_gpt-txt-LC36" class="blob-code blob-code-inner js-file-line">    for (const el of this.element.querySelectorAll("[data-hotkey]")) {</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L37" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="37"></td>
          <td id="file-rails_gpt-txt-LC37" class="blob-code blob-code-inner js-file-line">      install(el);</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L38" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="38"></td>
          <td id="file-rails_gpt-txt-LC38" class="blob-code blob-code-inner js-file-line">    }</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L39" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="39"></td>
          <td id="file-rails_gpt-txt-LC39" class="blob-code blob-code-inner js-file-line">  }</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L40" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="40"></td>
          <td id="file-rails_gpt-txt-LC40" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L41" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="41"></td>
          <td id="file-rails_gpt-txt-LC41" class="blob-code blob-code-inner js-file-line">  uninstallHotkeys() {</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L42" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="42"></td>
          <td id="file-rails_gpt-txt-LC42" class="blob-code blob-code-inner js-file-line">    for (const el of this.element.querySelectorAll("[data-hotkey]")) {</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L43" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="43"></td>
          <td id="file-rails_gpt-txt-LC43" class="blob-code blob-code-inner js-file-line">      uninstall(el); // @github/hotkey exports uninstall as the counterpart to install</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L44" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="44"></td>
          <td id="file-rails_gpt-txt-LC44" class="blob-code blob-code-inner js-file-line">    }</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L45" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="45"></td>
          <td id="file-rails_gpt-txt-LC45" class="blob-code blob-code-inner js-file-line">  }</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L46" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="46"></td>
          <td id="file-rails_gpt-txt-LC46" class="blob-code blob-code-inner js-file-line">}</td>
        </tr>
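        <tr>
          <td class="blob-num js-line-number js-code-nav-line-number js-blob-rnum"></td>
          <td class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td class="blob-num js-line-number js-code-nav-line-number js-blob-rnum"></td>
          <td class="blob-code blob-code-inner js-file-line">Example usage in HTML (an illustrative sketch; the controller identifier "hotkeys" and the link are hypothetical, while data-hotkey is the attribute @github/hotkey reads):</td>
        </tr>
        <tr>
          <td class="blob-num js-line-number js-code-nav-line-number js-blob-rnum"></td>
          <td class="blob-code blob-code-inner js-file-line">&lt;div data-controller="hotkeys"&gt;</td>
        </tr>
        <tr>
          <td class="blob-num js-line-number js-code-nav-line-number js-blob-rnum"></td>
          <td class="blob-code blob-code-inner js-file-line">  &lt;a href="/inbox" data-hotkey="g i"&gt;Go to inbox&lt;/a&gt;</td>
        </tr>
        <tr>
          <td class="blob-num js-line-number js-code-nav-line-number js-blob-rnum"></td>
          <td class="blob-code blob-code-inner js-file-line">&lt;/div&gt;</td>
        </tr>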
        <tr>
          <td id="file-rails_gpt-txt-L47" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="47"></td>
          <td id="file-rails_gpt-txt-LC47" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L48" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="48"></td>
          <td id="file-rails_gpt-txt-LC48" class="blob-code blob-code-inner js-file-line">Example: Rails Controller</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L49" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="49"></td>
          <td id="file-rails_gpt-txt-LC49" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L50" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="50"></td>
          <td id="file-rails_gpt-txt-LC50" class="blob-code blob-code-inner js-file-line">class AssessmentsController &lt; TrainingBaseController</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L51" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="51"></td>
          <td id="file-rails_gpt-txt-LC51" class="blob-code blob-code-inner js-file-line">  before_action :disable_training_search</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L52" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="52"></td>
          <td id="file-rails_gpt-txt-LC52" class="blob-code blob-code-inner js-file-line">  before_action :find_in_progress_or_start_new, only: %i[new create new_via_signup]</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L53" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="53"></td>
          <td id="file-rails_gpt-txt-LC53" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L54" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="54"></td>
          <td id="file-rails_gpt-txt-LC54" class="blob-code blob-code-inner js-file-line">  def new_via_signup</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L55" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="55"></td>
          <td id="file-rails_gpt-txt-LC55" class="blob-code blob-code-inner js-file-line">    if @assessment.persisted?</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L56" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="56"></td>
          <td id="file-rails_gpt-txt-LC56" class="blob-code blob-code-inner js-file-line">      @response = find_or_create_response_for(@assessment)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L57" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="57"></td>
          <td id="file-rails_gpt-txt-LC57" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L58" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="58"></td>
          <td id="file-rails_gpt-txt-LC58" class="blob-code blob-code-inner js-file-line">      if @response.complete?</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L59" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="59"></td>
          <td id="file-rails_gpt-txt-LC59" class="blob-code blob-code-inner js-file-line">        redirect_to training_home_path, status: :see_other</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L60" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="60"></td>
          <td id="file-rails_gpt-txt-LC60" class="blob-code blob-code-inner js-file-line">      end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L61" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="61"></td>
          <td id="file-rails_gpt-txt-LC61" class="blob-code blob-code-inner js-file-line">    end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L62" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="62"></td>
          <td id="file-rails_gpt-txt-LC62" class="blob-code blob-code-inner js-file-line">  end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L63" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="63"></td>
          <td id="file-rails_gpt-txt-LC63" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L64" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="64"></td>
          <td id="file-rails_gpt-txt-LC64" class="blob-code blob-code-inner js-file-line">  def new</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L65" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="65"></td>
          <td id="file-rails_gpt-txt-LC65" class="blob-code blob-code-inner js-file-line">    if @assessment.persisted?</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L66" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="66"></td>
          <td id="file-rails_gpt-txt-LC66" class="blob-code blob-code-inner js-file-line">      @response = find_or_create_response_for(@assessment)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L67" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="67"></td>
          <td id="file-rails_gpt-txt-LC67" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L68" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="68"></td>
          <td id="file-rails_gpt-txt-LC68" class="blob-code blob-code-inner js-file-line">      if @response.complete?</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L69" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="69"></td>
          <td id="file-rails_gpt-txt-LC69" class="blob-code blob-code-inner js-file-line">        redirect_to assessment_results_path(@assessment), status: :see_other</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L70" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="70"></td>
          <td id="file-rails_gpt-txt-LC70" class="blob-code blob-code-inner js-file-line">      end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L71" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="71"></td>
          <td id="file-rails_gpt-txt-LC71" class="blob-code blob-code-inner js-file-line">    else</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L72" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="72"></td>
          <td id="file-rails_gpt-txt-LC72" class="blob-code blob-code-inner js-file-line">      @last_assessment = current_person.assessments.order(created_at: :desc).first</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L73" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="73"></td>
          <td id="file-rails_gpt-txt-LC73" class="blob-code blob-code-inner js-file-line">    end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L74" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="74"></td>
          <td id="file-rails_gpt-txt-LC74" class="blob-code blob-code-inner js-file-line">  end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L75" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="75"></td>
          <td id="file-rails_gpt-txt-LC75" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L76" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="76"></td>
          <td id="file-rails_gpt-txt-LC76" class="blob-code blob-code-inner js-file-line">  def create</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L77" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="77"></td>
          <td id="file-rails_gpt-txt-LC77" class="blob-code blob-code-inner js-file-line">    @assessment.survey = current_organization.assessment_surveys.first!</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L78" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="78"></td>
          <td id="file-rails_gpt-txt-LC78" class="blob-code blob-code-inner js-file-line">    @assessment.onboarding_assessment = params[:assessment]&amp;.fetch(:onboarding_assessment).present?</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L79" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="79"></td>
          <td id="file-rails_gpt-txt-LC79" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L80" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="80"></td>
          <td id="file-rails_gpt-txt-LC80" class="blob-code blob-code-inner js-file-line">    ActiveRecord::Base.transaction do</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L81" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="81"></td>
          <td id="file-rails_gpt-txt-LC81" class="blob-code blob-code-inner js-file-line">      @assessment.save!</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L82" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="82"></td>
          <td id="file-rails_gpt-txt-LC82" class="blob-code blob-code-inner js-file-line">      @response = find_or_create_response_for(@assessment)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L83" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="83"></td>
          <td id="file-rails_gpt-txt-LC83" class="blob-code blob-code-inner js-file-line">      track("assessment_created",</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L84" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="84"></td>
          <td id="file-rails_gpt-txt-LC84" class="blob-code blob-code-inner js-file-line">            assessment_id: @assessment.id,</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L85" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="85"></td>
          <td id="file-rails_gpt-txt-LC85" class="blob-code blob-code-inner js-file-line">            onboarding_assessment: @assessment.onboarding_assessment)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L86" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="86"></td>
          <td id="file-rails_gpt-txt-LC86" class="blob-code blob-code-inner js-file-line">    end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L87" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="87"></td>
          <td id="file-rails_gpt-txt-LC87" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L88" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="88"></td>
          <td id="file-rails_gpt-txt-LC88" class="blob-code blob-code-inner js-file-line">    redirect_to next_assessment_response_answers_path(@response), status: :see_other</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L89" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="89"></td>
          <td id="file-rails_gpt-txt-LC89" class="blob-code blob-code-inner js-file-line">  end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L90" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="90"></td>
          <td id="file-rails_gpt-txt-LC90" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L91" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="91"></td>
          <td id="file-rails_gpt-txt-LC91" class="blob-code blob-code-inner js-file-line">  private</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L92" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="92"></td>
          <td id="file-rails_gpt-txt-LC92" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L93" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="93"></td>
          <td id="file-rails_gpt-txt-LC93" class="blob-code blob-code-inner js-file-line">  def find_or_create_response_for(assessment)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L94" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="94"></td>
          <td id="file-rails_gpt-txt-LC94" class="blob-code blob-code-inner js-file-line">    response = assessment.responses.find_by(author: current_person)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L95" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="95"></td>
          <td id="file-rails_gpt-txt-LC95" class="blob-code blob-code-inner js-file-line">    response ||= assessment.responses.incomplete.find_or_create_by!(author: current_person)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L96" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="96"></td>
          <td id="file-rails_gpt-txt-LC96" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L97" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="97"></td>
          <td id="file-rails_gpt-txt-LC97" class="blob-code blob-code-inner js-file-line">    response</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L98" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="98"></td>
          <td id="file-rails_gpt-txt-LC98" class="blob-code blob-code-inner js-file-line">  end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L99" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="99"></td>
          <td id="file-rails_gpt-txt-LC99" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L100" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="100"></td>
          <td id="file-rails_gpt-txt-LC100" class="blob-code blob-code-inner js-file-line">  def find_in_progress_or_start_new</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L101" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="101"></td>
          <td id="file-rails_gpt-txt-LC101" class="blob-code blob-code-inner js-file-line">    @assessment = if current_person.assessments.open.any?</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L102" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="102"></td>
          <td id="file-rails_gpt-txt-LC102" class="blob-code blob-code-inner js-file-line">                    current_person.assessments.open.first</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L103" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="103"></td>
          <td id="file-rails_gpt-txt-LC103" class="blob-code blob-code-inner js-file-line">                  else</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L104" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="104"></td>
          <td id="file-rails_gpt-txt-LC104" class="blob-code blob-code-inner js-file-line">                    current_person.assessments.new</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L105" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="105"></td>
          <td id="file-rails_gpt-txt-LC105" class="blob-code blob-code-inner js-file-line">                  end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L106" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="106"></td>
          <td id="file-rails_gpt-txt-LC106" class="blob-code blob-code-inner js-file-line">  end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L107" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="107"></td>
          <td id="file-rails_gpt-txt-LC107" class="blob-code blob-code-inner js-file-line">end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L108" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="108"></td>
          <td id="file-rails_gpt-txt-LC108" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L109" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="109"></td>
          <td id="file-rails_gpt-txt-LC109" class="blob-code blob-code-inner js-file-line">Example: Rails View</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L110" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="110"></td>
          <td id="file-rails_gpt-txt-LC110" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L111" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="111"></td>
          <td id="file-rails_gpt-txt-LC111" class="blob-code blob-code-inner js-file-line">&lt;header class="hero-header border-bottom overflow-hidden" data-controller="&lt;%= "turbo-cache-clear" if params[:expire].present? %&gt;"&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L112" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="112"></td>
          <td id="file-rails_gpt-txt-LC112" class="blob-code blob-code-inner js-file-line">  &lt;div class="container pt-4 pt-md-5"&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L113" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="113"></td>
          <td id="file-rails_gpt-txt-LC113" class="blob-code blob-code-inner js-file-line">    &lt;div class="row g-4 g-md-5 justify-content-center"&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L114" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="114"></td>
          <td id="file-rails_gpt-txt-LC114" class="blob-code blob-code-inner js-file-line">      &lt;div class="col-12 col-xl-10 text-center"&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L115" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="115"></td>
          <td id="file-rails_gpt-txt-LC115" class="blob-code blob-code-inner js-file-line">        &lt;h1 class="display-3"&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L116" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="116"></td>
          <td id="file-rails_gpt-txt-LC116" class="blob-code blob-code-inner js-file-line">          Thoughtful,</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L117" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="117"></td>
          <td id="file-rails_gpt-txt-LC117" class="blob-code blob-code-inner js-file-line">          &lt;span class="highlight"&gt;lightweight&lt;/span&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L118" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="118"></td>
          <td id="file-rails_gpt-txt-LC118" class="blob-code blob-code-inner js-file-line">          leadership learning.</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L119" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="119"></td>
          <td id="file-rails_gpt-txt-LC119" class="blob-code blob-code-inner js-file-line">        &lt;/h1&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L120" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="120"></td>
          <td id="file-rails_gpt-txt-LC120" class="blob-code blob-code-inner js-file-line">        &lt;p class="fs-3 mb-4 text-body-secondary"&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L121" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="121"></td>
          <td id="file-rails_gpt-txt-LC121" class="blob-code blob-code-inner js-file-line">          Our leadership improvement app gives you daily quick tips - plus</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L122" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="122"></td>
          <td id="file-rails_gpt-txt-LC122" class="blob-code blob-code-inner js-file-line">          deeper coaching and learning for when you face more complex</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L123" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="123"></td>
          <td id="file-rails_gpt-txt-LC123" class="blob-code blob-code-inner js-file-line">          leadership&amp;nbsp;challenges.</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L124" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="124"></td>
          <td id="file-rails_gpt-txt-LC124" class="blob-code blob-code-inner js-file-line">        &lt;/p&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L125" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="125"></td>
          <td id="file-rails_gpt-txt-LC125" class="blob-code blob-code-inner js-file-line">        &lt;%= link_to "Start for free",</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L126" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="126"></td>
          <td id="file-rails_gpt-txt-LC126" class="blob-code blob-code-inner js-file-line">            marketing_new_training_account_path,</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L127" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="127"></td>
          <td id="file-rails_gpt-txt-LC127" class="blob-code blob-code-inner js-file-line">            class: "btn btn-primary btn-lg fs-3 fw-bold" %&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L128" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="128"></td>
          <td id="file-rails_gpt-txt-LC128" class="blob-code blob-code-inner js-file-line">      &lt;/div&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L129" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="129"></td>
          <td id="file-rails_gpt-txt-LC129" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L130" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="130"></td>
          <td id="file-rails_gpt-txt-LC130" class="blob-code blob-code-inner js-file-line">      &lt;div class="col-12 col-md-10 col-xl-8"&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L131" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="131"></td>
          <td id="file-rails_gpt-txt-LC131" class="blob-code blob-code-inner js-file-line">        &lt;div class="parallax-container ratio ratio-16x9 position-relative" data-controller="parallax" aria-hidden="true"&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L132" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="132"></td>
          <td id="file-rails_gpt-txt-LC132" class="blob-code blob-code-inner js-file-line">          &lt;div class="image-container"&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L133" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="133"></td>
          <td id="file-rails_gpt-txt-LC133" class="blob-code blob-code-inner js-file-line">            &lt;figure class="pretty-border overflow-hidden rounded-1 shadow-lg" data-parallax-target="image"&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L134" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="134"></td>
          <td id="file-rails_gpt-txt-LC134" class="blob-code blob-code-inner js-file-line">              &lt;%= image_tag "marketing/screenshots/training/desktop/lesson-1.png",</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L135" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="135"></td>
          <td id="file-rails_gpt-txt-LC135" class="blob-code blob-code-inner js-file-line">                  class: "parallax-image",</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L136" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="136"></td>
          <td id="file-rails_gpt-txt-LC136" class="blob-code blob-code-inner js-file-line">                  width: "1200",</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L137" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="137"></td>
          <td id="file-rails_gpt-txt-LC137" class="blob-code blob-code-inner js-file-line">                  height: "750" %&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L138" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="138"></td>
          <td id="file-rails_gpt-txt-LC138" class="blob-code blob-code-inner js-file-line">            &lt;/figure&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L139" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="139"></td>
          <td id="file-rails_gpt-txt-LC139" class="blob-code blob-code-inner js-file-line">            &lt;figure class="ios-screenshot pretty-shadow overflow-hidden p-1 bg-body" data-parallax-target="image" data-parallax-speed="0.3"&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L140" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="140"></td>
          <td id="file-rails_gpt-txt-LC140" class="blob-code blob-code-inner js-file-line">              &lt;%= image_tag "marketing/screenshots/training/ios/home.png",</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L141" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="141"></td>
          <td id="file-rails_gpt-txt-LC141" class="blob-code blob-code-inner js-file-line">                  class: "parallax-image pretty-border",</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L142" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="142"></td>
          <td id="file-rails_gpt-txt-LC142" class="blob-code blob-code-inner js-file-line">                  width: "393",</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L143" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="143"></td>
          <td id="file-rails_gpt-txt-LC143" class="blob-code blob-code-inner js-file-line">                  height: "852" %&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L144" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="144"></td>
          <td id="file-rails_gpt-txt-LC144" class="blob-code blob-code-inner js-file-line">            &lt;/figure&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L145" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="145"></td>
          <td id="file-rails_gpt-txt-LC145" class="blob-code blob-code-inner js-file-line">          &lt;/div&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L146" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="146"></td>
          <td id="file-rails_gpt-txt-LC146" class="blob-code blob-code-inner js-file-line">        &lt;/div&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L147" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="147"></td>
          <td id="file-rails_gpt-txt-LC147" class="blob-code blob-code-inner js-file-line">      &lt;/div&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L148" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="148"></td>
          <td id="file-rails_gpt-txt-LC148" class="blob-code blob-code-inner js-file-line">    &lt;/div&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L149" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="149"></td>
          <td id="file-rails_gpt-txt-LC149" class="blob-code blob-code-inner js-file-line">  &lt;/div&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L150" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="150"></td>
          <td id="file-rails_gpt-txt-LC150" class="blob-code blob-code-inner js-file-line">&lt;/header&gt;</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L151" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="151"></td>
          <td id="file-rails_gpt-txt-LC151" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L152" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="152"></td>
          <td id="file-rails_gpt-txt-LC152" class="blob-code blob-code-inner js-file-line">Example: Rails Test</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L153" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="153"></td>
          <td id="file-rails_gpt-txt-LC153" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L154" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="154"></td>
          <td id="file-rails_gpt-txt-LC154" class="blob-code blob-code-inner js-file-line">class NotesControllerTest &lt; ActionDispatch::IntegrationTest</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L155" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="155"></td>
          <td id="file-rails_gpt-txt-LC155" class="blob-code blob-code-inner js-file-line">  setup do</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L156" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="156"></td>
          <td id="file-rails_gpt-txt-LC156" class="blob-code blob-code-inner js-file-line">    @note = notes(:basecamp_one)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L157" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="157"></td>
          <td id="file-rails_gpt-txt-LC157" class="blob-code blob-code-inner js-file-line">  end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L158" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="158"></td>
          <td id="file-rails_gpt-txt-LC158" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L159" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="159"></td>
          <td id="file-rails_gpt-txt-LC159" class="blob-code blob-code-inner js-file-line">  test "GET #index when viewing someone else's notes" do</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L160" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="160"></td>
          <td id="file-rails_gpt-txt-LC160" class="blob-code blob-code-inner js-file-line">    sign_in_as people(:jason)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L161" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="161"></td>
          <td id="file-rails_gpt-txt-LC161" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L162" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="162"></td>
          <td id="file-rails_gpt-txt-LC162" class="blob-code blob-code-inner js-file-line">    get person_notes_path(@note.person)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L163" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="163"></td>
          <td id="file-rails_gpt-txt-LC163" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L164" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="164"></td>
          <td id="file-rails_gpt-txt-LC164" class="blob-code blob-code-inner js-file-line">    assert_response :ok</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L165" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="165"></td>
          <td id="file-rails_gpt-txt-LC165" class="blob-code blob-code-inner js-file-line">    assert_match "Add a new note", response.body</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L166" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="166"></td>
          <td id="file-rails_gpt-txt-LC166" class="blob-code blob-code-inner js-file-line">  end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L167" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="167"></td>
          <td id="file-rails_gpt-txt-LC167" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L168" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="168"></td>
          <td id="file-rails_gpt-txt-LC168" class="blob-code blob-code-inner js-file-line">  test "GET #index when viewing the index with a shared note" do</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L169" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="169"></td>
          <td id="file-rails_gpt-txt-LC169" class="blob-code blob-code-inner js-file-line">    sign_in_as people(:andrea)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L170" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="170"></td>
          <td id="file-rails_gpt-txt-LC170" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L171" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="171"></td>
          <td id="file-rails_gpt-txt-LC171" class="blob-code blob-code-inner js-file-line">    @note.members &lt;&lt; people(:andrea)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L172" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="172"></td>
          <td id="file-rails_gpt-txt-LC172" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L173" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="173"></td>
          <td id="file-rails_gpt-txt-LC173" class="blob-code blob-code-inner js-file-line">    get person_notes_path(@note.person)</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L174" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="174"></td>
          <td id="file-rails_gpt-txt-LC174" class="blob-code blob-code-inner js-file-line">
</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L175" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="175"></td>
          <td id="file-rails_gpt-txt-LC175" class="blob-code blob-code-inner js-file-line">    assert_response :ok</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L176" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="176"></td>
          <td id="file-rails_gpt-txt-LC176" class="blob-code blob-code-inner js-file-line">    assert_match @note.subject, response.body</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L177" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="177"></td>
          <td id="file-rails_gpt-txt-LC177" class="blob-code blob-code-inner js-file-line">    assert_select ".note-title h2", @note.subject</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L178" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="178"></td>
          <td id="file-rails_gpt-txt-LC178" class="blob-code blob-code-inner js-file-line">  end</td>
        </tr>
        <tr>
          <td id="file-rails_gpt-txt-L179" class="blob-num js-line-number js-code-nav-line-number js-blob-rnum" data-line-number="179"></td>
          <td id="file-rails_gpt-txt-LC179" class="blob-code blob-code-inner js-file-line">end</td>
        </tr>
  </tbody></table>
</div>


    </div>

  </div>
</div>

      </div>
      <div class="gist-meta">
        <a href="https://gist.github.com/danielvlopes/84bf43d4785b2090ccd47a84f41206ac/raw/6a68090e0d460a30700ed23935aff0a2d1aeab17/rails_gpt.txt" style="float:right" class="Link--inTextBlock">view raw</a>
        <a href="https://gist.github.com/danielvlopes/84bf43d4785b2090ccd47a84f41206ac#file-rails_gpt-txt" class="Link--inTextBlock">
          rails_gpt.txt
        </a>
        hosted with &#10084; by <a class="Link--inTextBlock" href="https://github.com">GitHub</a>
      </div>
    </div>
</div>
</div>]]></content:encoded></item><item><title><![CDATA[First 120 Days as Technical Co-Founder]]></title><description><![CDATA[Reflection on my first quarter after joining Know Your Company]]></description><link>https://journal.daniellopes.dev/p/first-120-days-as-technical-co-founder</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/first-120-days-as-technical-co-founder</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Wed, 08 May 2024 19:37:55 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!c9yl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!c9yl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!c9yl!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!c9yl!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!c9yl!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png 1272w, 
https://substackcdn.com/image/fetch/$s_!c9yl!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!c9yl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2153278,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!c9yl!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!c9yl!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!c9yl!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png 1272w, 
https://substackcdn.com/image/fetch/$s_!c9yl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F079a9342-e911-4ba7-9133-783f7ef5c6f7_1456x832.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>I've been looking back into my daily stand-up updates and turning things into monthly or quarterly reflections under the <a href="https://journal.daniellopes.dev/t/journal">#journal tag</a> here. 
This note is from my first quarter at <a href="http://canopy.is">Canopy</a> after doing some <a href="https://journal.daniellopes.dev/p/parsing-7-years-of-stand-up-updates">parsing and summarization over the last 7 years</a>. </p><p>The company was bootstrapped and didn't have a technical co-founder yet. My partner had done a great job so far with the resources she had, but now there was some foundational work to do if we wanted to see growth.</p><h2>Month 1</h2><p>In my first week, I set up <a href="https://segment.com">Segment</a> to integrate our marketing and sales tools like CRM and Drip Campaigns, and to lay the groundwork for a more robust analytics system.</p><p>Our in-app analytics was slow and crashing, so it had been turned off, and we had no reports for things like engagement or revenue per account. Sales had somewhat long cycles driven by contracts with a high upfront cost; we had pay-once-per-email-address pricing at the time, which was likely masking the real MRR/ARR.</p><p>During the first month, my goal was to improve admin performance and get our reports running again. I did this without setting up a new BI tool, although connecting Segment to a warehouse using BI tools like Hex or Mode would have been smarter.</p><p>I also streamlined our sales process by integrating <a href="http://freshworks.com/">Freshsales</a>, added an Intercom chat for user feedback, and upgraded to Rails 5.</p><p>My main focuses in the first 30 days were:</p><ol><li><p>Lowering the error rate.</p></li><li><p>Updating our tech stack.</p></li><li><p>Implementing key marketing, sales, and support tools.</p></li><li><p>Getting visibility on key metrics like retention, engagement, and LTV.</p></li></ol><h2>Month 2</h2><p>We launched a Knowledge Center feature within the product&#8212;a collection of eBooks that users can read directly in the app. This required building a simple CMS with Markdown support and PDF download functionality. 
Interestingly, this was the initial feature that would validate the interest in content as a service, which we fully embraced in 2023.</p><p>I also began building our Slack integration. At the time, our system relied heavily on email, and most customers had migrated to Slack. Investing in the Slack integration proved to be a good decision for engagement.</p><p>Additionally, I helped with sales enablement initiatives, like deploying a read-only demo environment. I also supported some marketing efforts, like a hosting platform for an email-based mini-course.</p><h2>Months 3 and 4</h2><p>We shipped our Slack integration and improved our billing system to enable future recurring-pricing tests. We also fixed several billing issues.</p><p>It became clear that engagement was strong in the first months but tapered off (which made sense for the feature we had). I surveyed customers and analyzed user session recordings to understand this better.</p><p>We realized our product wasn't ready for a dedicated sales role yet. It was time for <a href="https://www.foundingsales.com">founding sales</a>. To achieve better product-market fit, we unfortunately had to let go of our salesperson, and my co-founder took over sales responsibilities to enable founder-led sales.</p><p>--</p><h3>Reflection</h3><p>In my opinion, product-market fit (PMF) works in levels. What we had was more than sufficient for a great small business of two people making up to $200k each, but it wasn't enough for our SaaS compound-growth ambitions. We didn't have PMF for the next level. And when trying to find PMF, you don&#8217;t want things like sales quotas or bonuses interfering with the process.</p><p>Also, get visibility as soon as you start getting some volume. When you are in the low hundreds of users, you may not need full-fledged BI yet. Tools like <a href="https://www.smartlook.com">Smartlook</a> or a basic activity feed per account might be enough. 
However, nowadays, I would add BI much sooner using <a href="https://hex.tech">Hex</a> because it's so easy to set up and maintain.</p>]]></content:encoded></item><item><title><![CDATA[Using LLMs to parse 7 years of stand-up updates]]></title><description><![CDATA[Using Claude, ChatGPT + OpenAI API to reflect on my last few years at work]]></description><link>https://journal.daniellopes.dev/p/parsing-7-years-of-stand-up-updates</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/parsing-7-years-of-stand-up-updates</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Wed, 08 May 2024 19:28:20 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!vGSU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I've been trying to help some colleagues get a job in the current market, and one thing I noticed is that people struggle to keep their LinkedIn profiles and resumes up to date with their actual achievements.</p><p>I think everyone should have a Year Review doc and take notes&nbsp;quarterly on all their work (projects, impact, launches, metrics, etc.) so they can use it in interviews and in their resume.</p><p>This is a problem I have myself. I've spent 7 years at <a href="http://canopy.is">Canopy</a>, wrote most of the <a href="https://twitter.com/danielvlopes/status/1766279784160796688">200kloc</a> of our codebase, helped a lot with our UI design, was involved in all product and hiring decisions, helped fundraise, etc.</p><p>But my <a href="https://www.linkedin.com/in/danielvlopes/">LinkedIn</a> is empty. I honestly don't even remember what I did.</p><p>The thing is that for 7 years I've used <a href="https://canopy.is/m/tools">Canopy Perform</a> to record daily standups. I have a journal of everything. 
Now, how do I make sense of it?</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vGSU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!vGSU!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!vGSU!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!vGSU!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!vGSU!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!vGSU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png" width="1456" height="832" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1804718,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!vGSU!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!vGSU!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!vGSU!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png 1272w, https://substackcdn.com/image/fetch/$s_!vGSU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb02d2ec4-7e57-49b6-91c6-723892859d01_1456x832.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3>Summarization</h3><p>So I decided to summarize everything with GPT4 and Claude Opus and see the differences. The steps:</p><p>1. Exported the CSV with all my <a href="https://canopy.is/m/tools/heartbeats">Heartbeats</a> from Canopy Perform. The data was a row for every day containing HTML with lists, strong tags, and divs of what I did on that day, what I did the day before, and sometimes reflections and challenges.</p><p>2. Used ChatGPT with Code Interpreter to rewrite the CSV, combining every day into a new CSV that had weeks instead of days, and the value of each week was all the days of that week combined.</p><p>3. Wrote <a href="https://gist.github.com/danielvlopes/4b3d0fb0c4a0c4d42a9479e9dc726d0a">this script to call OpenAI gpt-4-turbo-preview</a> and summarize every week while getting rid of the HTML (tried with GPT 3.5, and as usual, it's useless). 
The output wasn't perfect; I should have added more guidance on vocabulary to cut the BS words, but it was good enough.</p><p>4. Then I wanted to manually go over every month and try to reflect on and remember what I did. For that, I decided to test the differences between a Custom GPT in ChatGPT, the OpenAI Playground, and Claude Opus, all with the same system message.</p><h3>Comparing Claude, Custom GPT and OpenAI API</h3><p>Here's the comparison of each on my first full month at Canopy:</p><p>This was the <strong><a href="https://gist.github.com/danielvlopes/32b775d3ccc0b18a83c7b30e644b1e6f">System Message</a></strong> (the same one used for all), and <strong><a href="https://gist.github.com/danielvlopes/cf56d14c5866430855560c8a48cd0c56">here&#8217;s the input data</a></strong>.</p><p>Note: the summary was generated by GPT-4 with low guidance, so there's some BS vocabulary that I would never use.</p><h4>Claude console result</h4><p>All default settings: temperature 0, claude-3-opus-2024, max tokens to sample 1k</p><pre><code>Title: April 2017: Focusing on Admin Panel, Rails 5 Update, and Slack Integration:

In April, I focused on improving our Admin panel, particularly the statistics section, which was deactivated due to performance issues. I also worked on updating our system to Rails 5 to keep pace with upcoming updates and facilitate a smoother transition to Rails 5.1 in the future. 

Additionally, I began integrating Slack into our workflow to enhance our communication and collaboration processes. Although I had to postpone some administrative tasks, such as completing revenue and question popularity reports, I prioritized these system updates and integrations to ensure our technology remains current and efficient.</code></pre><h4>ChatGPT with a Custom GPT:</h4><pre><code>**April 2017: System Enhancements and Administrative Focus**

This month involved significant focus on improving our Admin panel, particularly the statistics section, due to performance issues. I managed the transition from CRM and Intercom management to addressing these concerns, aiming to restore and optimize this vital tool for our team&#8217;s decision-making.

Following this, I tackled pending administrative tasks and began updating our system to Rails 5, balancing these priorities effectively. The update was crucial for keeping pace with new technology and was strategically planned to ensure a smooth transition to future versions.

I also initiated integration with Slack, enhancing our workflow efficiency, although this task is ongoing. Overall, the month was about enhancing system performance and efficiency while managing routine administrative duties.</code></pre><h4>OpenAI Playground (GPT-4-turbo-preview)</h4><p>Default temperature 0.5 with gpt-4-turbo but max tokens increased to 1k</p><pre><code>### Monthly Work Summary: April 2017

This month, I focused on various critical areas in our system administration. I began by addressing performance issues in the statistics section of our Admin panel, aiming to enhance its efficiency and restore its functionality.

Following that, I handled pending administrative tasks and initiated the upgrade of our system to Rails 5. This update was important to keep our system compatible with upcoming versions and improve overall performance.

Additionally, I planned to integrate Slack into our workflow to streamline communications. Each of these tasks was vital for maintaining and improving our operational capabilities.</code></pre><p>I think my preference order here is:</p><p>1. Claude Opus</p><p>2. OpenAI Playground with GPT-4-turbo</p><p>3. ChatGPT with Custom GPT</p><h4>What I actually did</h4><pre><code>Title: April 2017: Focusing on Admin Panel, Rails 5 Update, and Slack Integration

In April, I focused on improving our Admin panel, particularly the statistics section, which was turned off due to performance issues. 

I also updated us to Rails 5 to facilitate a smoother transition to Rails 5.1 in the future. Additionally, I began work on the Slack integration feature. Slack took priority over some of the extra Admin things I wanted to do, like adding Revenue and Engagement reports.</code></pre><p>Some of these things got lost in the first summarization pass that GPT-4 did in the Python script above.</p><h4>Conclusion</h4><p>After running through all my ~88 months at KYT/Canopy, I think this was a very helpful reflection exercise. It let me see everything I did and reflect on both my mistakes and the things I did right.</p><p>Moving forward, I plan to keep a monthly journal like this and keep it up-to-date here on&nbsp;Substack under the <a href="https://journal.daniellopes.dev/t/journal">Journal tag</a>. I&#8217;ll share a post a day in retrospect, covering all these last years. Here&#8217;s the first one: </p><div class="digest-post-embed" data-attrs="{&quot;nodeId&quot;:&quot;e49f25a7-247a-473c-aa4e-8b3fd43b5bf7&quot;,&quot;caption&quot;:&quot;&quot;,&quot;cta&quot;:null,&quot;showBylines&quot;:true,&quot;size&quot;:&quot;lg&quot;,&quot;isEditorNode&quot;:true,&quot;title&quot;:&quot;First 120 Days as Technical Co-Founder&quot;,&quot;publishedBylines&quot;:[{&quot;id&quot;:1072027,&quot;name&quot;:&quot;Daniel Lopes&quot;,&quot;bio&quot;:&quot;Founder, software engineer, product manager, and UI designer in San Francisco. Co-founded Canopy.is (a 37signals spin-off), now an advisor. Before Canopy, I led web eng. &amp; product at IFTTT.com. 
Exploring AI through building products using LLMs.&quot;,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6594f36b-16bc-4999-b80a-379f76c1d567_700x702.jpeg&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:null}],&quot;post_date&quot;:&quot;2024-05-08T19:37:55.489Z&quot;,&quot;cover_image&quot;:&quot;https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F62c0fd8c-f3c7-4ab9-bb93-a4a634d256e1_1552x784.jpeg&quot;,&quot;cover_image_alt&quot;:null,&quot;canonical_url&quot;:&quot;https://journal.daniellopes.dev/p/first-120-days-as-technical-co-founder&quot;,&quot;section_name&quot;:null,&quot;video_upload_id&quot;:null,&quot;id&quot;:144445502,&quot;type&quot;:&quot;newsletter&quot;,&quot;reaction_count&quot;:0,&quot;comment_count&quot;:0,&quot;publication_id&quot;:null,&quot;publication_name&quot;:&quot;Daniel Lopes&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F65cb27d6-59e6-4020-98e1-55ced89ce823_700x700.png&quot;,&quot;belowTheFold&quot;:true,&quot;youtube_url&quot;:null,&quot;show_links&quot;:null,&quot;feed_url&quot;:null}"></div><p></p><p></p>]]></content:encoded></item><item><title><![CDATA[End of a 7-year journey as a founder]]></title><description><![CDATA[This last month was bittersweet.&#160;I've celebrated multiple achievements at work. We rebranded KYT to Canopy in just three weeks, launched our iOS app, and shipped our AI Assistant. 
But last week was also my last full-time week at Canopy.]]></description><link>https://journal.daniellopes.dev/p/end-of-a-7-year-journey</link><guid isPermaLink="false">https://journal.daniellopes.dev/p/end-of-a-7-year-journey</guid><dc:creator><![CDATA[Daniel Lopes]]></dc:creator><pubDate>Mon, 06 May 2024 04:22:29 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ejlZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>This last month was bittersweet.</p><p>I've celebrated multiple achievements at work. We rebranded <a href="https://newsletter.canopy.is/p/how-we-did-a-rebrand-in-3-weeks-with">KYT to Canopy</a> in just three weeks, <a href="https://apps.apple.com/us/app/canopy-for-leaders/id6449951878">launched our iOS app</a>, and <a href="https://twitter.com/danielvlopes/status/1785777942049772011">shipped our AI Assistant</a>. But last week was also my last full-time week at Canopy.</p><p>Things weren't always easy. Going after the ambitious goal of helping people become better managers at scale with a small, bootstrapped team required a lot of effort, and I am so proud of what we accomplished.</p><p>For a long time after I joined, it was just Claire and me. We grew it into a business making seven figures in annual revenue, built an amazing team of eight, and reached tens of thousands of users. Working with Claire and the team has been amazing.</p><p>But after seven years, it&#8217;s time for something new. So I&#8217;m moving into a technical advisor role at Canopy.</p><p>I&#8217;m not exactly sure what comes next. One thing I am sure about is that I want to work with LLMs a bit more.</p><p>My last project at Canopy was our AI assistant. It was a blast to work on, and I learned a lot. 
This project is something I've been working on, on and off, since GPT-3, and now it&#8217;s up and running beautifully. </p><p>There's so much potential. For now, I&#8217;ll focus on learning more about similar applications. I learn best by doing, and I have several ideas involving LLMs: some more personal, others with possible business potential. </p><p>I&#8217;ve set aside three months to work on these projects. If I like the direction, I might extend for another three and go from there.</p><p>I'm going to share my experiences on Substack / <a href="https://twitter.com/danielvlopes">Twitter</a> / <a href="https://www.linkedin.com/in/danielvlopes/">LinkedIn</a>.</p><p>This is the first time since moving to the US that I have changed roles without another job lined up. It feels weird but also exciting.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ejlZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ejlZ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!ejlZ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!ejlZ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png 1272w, 
https://substackcdn.com/image/fetch/$s_!ejlZ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ejlZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2014809,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ejlZ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png 424w, https://substackcdn.com/image/fetch/$s_!ejlZ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png 848w, https://substackcdn.com/image/fetch/$s_!ejlZ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png 1272w, 
https://substackcdn.com/image/fetch/$s_!ejlZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd4617e26-82e5-456f-98ff-25d56b7b768d_1456x832.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div>]]></content:encoded></item></channel></rss>